Uber drivers union asks EU court to overrule ‘robo-firing’ by algorithm


A union representing Uber drivers in the U.K. has filed a second legal challenge against the ride-hailing giant in Europe, this time arguing that Uber’s alleged “robo-firing” practices contravene Article 22 of the EU General Data Protection Regulation (GDPR), which seeks to protect individuals from automated decision-making.

The action, which has been filed in the District Court of Amsterdam where Uber’s European HQ is located, was initiated by three U.K.-based former Uber drivers with the backing of the App Drivers & Couriers Union (ADCU), a U.K. trade union for drivers and couriers who work for app-based companies, alongside the Worker Info Exchange, a non-profit organisation aimed at helping gig economy workers. A fourth driver from Portugal is also joining the challenge with support from the International Alliance of App Based Transport Workers (IAATW).

This also represents the latest in a line of high-profile cases that have thrust algorithms into the spotlight over potential biases and their lack of accountability. Last year, Goldman Sachs hit the headlines following allegations of gender discrimination in the algorithms used to determine credit limits in the new Apple credit card.


This is the second such case filed by the ADCU in the last few months, after it sued Uber on behalf of drivers seeking data on what it calls the company’s “secret profiling.” The drivers argue that Uber withholds key data from them, such as the metrics it uses to monitor their performance, which is what enables Uber to exert “management control” over drivers. That case is due to be heard on December 16.

The latest challenge, which will run in parallel, is closely related, insofar as it seeks to use GDPR provisions to counter automated decision-making where humans have minimal oversight. Article 22 of the GDPR states:

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

At the crux of the ADCU’s latest case is how Uber allegedly kicked drivers off its platform using algorithms, accusing the subjects of “fraudulent activity” without any proper right of appeal. Although the filing acknowledges Uber’s argument that it does use “specialized employees” to assess account deactivations, the drivers’ lawyer, Anton Ekker, argues that Uber has not “further substantiated that this constitutes meaningful human intervention” and that the employees have been trained to understand “how the artificially intelligent system works.” The document reads:

In particular, Uber has not demonstrated that the employees involved in automated decision-making:

– Have a meaningful influence on the decision, which means, among other things, that they must have the ‘authority and competence’ to oppose this decision;
– ‘Weigh’ and ‘interpret’ the recommendation of Uber’s artificial intelligence system, considering all available data and taking into account additional factors;
– Can predict how the ‘output’ of the system will change if the ‘inputs’ are adjusted;
– Are able to determine which input contributed the most to a specific output;
– Are able to determine when the output is incorrect.


The court filing notes that in each of the four cases, the drivers were “dismissed after Uber said its systems had detected fraudulent activity,” something the drivers deny. Moreover, the document alleges that Uber has never provided the drivers with any evidence to support its claims. The ADCU affirms that it believes that Uber is actually concealing performance-related deactivations behind allegations of fraud, including situations where a driver may log themselves out of the Uber app when demand is low and surge pricing is not in place.

This is where the two separate legal challenges overlap, as both cases hinge on allegations that Uber doesn’t give its drivers access to any meaningful data, including performance metrics, that its automated systems use to make decisions.

“We contend that Uber has automated the process of deactivating drivers to such an extent that it is subject to abuse and error — as is the case with the claimants — and that there has been no meaningful human intervention,” James Farrar, director of the Worker Info Exchange told VentureBeat. “In essence, these are robo-sackings without proper review or right of appeal. As such that is a violation of the law under GDPR.”

The ADCU is also asking other Uber drivers and couriers who have been impacted by deactivations to register and join “potential future action,” and it has launched a crowdfunding campaign to raise £20,000 ($26,000) over the next three weeks to fund the legal action.

VentureBeat has reached out to Uber for comment, and will update here if or when we hear back.

