In February 2021, Uber lost a judgement in the Netherlands where it was challenged over the alleged ‘robo-firings’ of drivers. The Court of Amsterdam ordered Uber to reinstate six drivers who claimed they were unfairly terminated “by algorithmic means.” Uber was also ordered to pay the fired drivers compensation. Uber had until March 29 to comply with the order but failed to do so.
What happened in this case?
Drivers working for Uber use facial identification software to access the Uber system. The tech is designed to check and verify that driver accounts are not being used by anyone other than licensed individuals.
To work, the tool requires drivers to submit a photograph of themselves, which is checked against their driver’s licence. Once registered, drivers are regularly prompted to take real-time selfies for verification. These facial checks are then matched against the account holder’s profile picture. If a driver fails the ID check, their account is immediately suspended. However, some drivers claimed that this technology cost them their livelihoods because the software was incapable of recognising their faces.
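The verification flow described above can be sketched in outline. This is purely illustrative: the function names, the similarity threshold, and the review step are assumptions for the sake of the example, not details of Uber’s actual system.

```python
# Hypothetical sketch of the driver ID-check flow described above.
# THRESHOLD, match_score and the outcome labels are illustrative
# assumptions, not Uber's actual implementation.

THRESHOLD = 0.8  # assumed similarity cut-off for a "pass"

def verify_driver(selfie, profile_photo, match_score):
    """Return the outcome of one real-time selfie check."""
    if match_score(selfie, profile_photo) >= THRESHOLD:
        return "verified"
    # A failed check suspends the account. Under Article 22 of the
    # GDPR, a decision with this kind of impact should involve human
    # review rather than being fully automated.
    return "suspended_pending_human_review"

# Example with a stand-in scoring function that reports a poor match:
poor_match = lambda a, b: 0.5
print(verify_driver("selfie.jpg", "profile.jpg", poor_match))
```

The point of the sketch is the last branch: whether a failed check triggers human review or an automatic suspension is exactly what is in dispute in this case.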
A challenge was made under Article 22 of the European Union’s General Data Protection Regulation (GDPR). This Article provides protection for individuals against purely automated decisions with a legal or significant impact.
However, this case is not as clear cut as it looks. The judgement, issued in late February, was a default judgement: Uber claims it was not aware of the case and therefore did not contest it. Uber also failed to comply with the order and says this is because it was not aware of the ruling.
What happens next?
As Uber believes that the judgement in this case was not correctly served, it is now applying to set the default ruling aside and have its case heard. The Worker Info Exchange (WIE), which is supporting the litigants along with the App Drivers & Couriers Union (ADCU), disputes Uber’s claim that the correct procedures were not followed, but has welcomed the opportunity to challenge Uber in court.
A wider problem for Uber
The issue of algorithmic firings isn’t likely to disappear for Uber, especially as its facial identification software has been accused of discriminating against people with darker skin.
At Keller Postman UK, we are supporting Uber drivers in England & Wales who have GDPR concerns over Uber’s facial recognition software, algorithmic accountability, and automated decision-making processes. If these concerns are shown to be valid, this is not only a workers’ rights issue but also a clear breach of the GDPR.
In our discussions with drivers, we have heard how simply growing or shaving a beard can lead to a failed check. After a failed ID check, some drivers have been threatened with termination, had their accounts frozen and been left unable to work, or even been permanently fired. And, while Uber claims that it carries out manual human checks on failed verifications, the drivers allege that the process is automated and that they are left without any right to appeal.
To make a bad situation worse, we have heard of cases where an Uber driver has failed an ID verification and, as a result, lost his job, his Uber licence, and his private licence.
Facial recognition software and GDPR
Commenting on this case, Kingsley Hayes, our head of data breach, said:
“Article 22 of the GDPR concerns “Automated individual decision-making, including profiling”. And it is here that Uber is likely breaking the law. Under this legislation, people “have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”.
“While this does not apply “if necessary, for entering into, or performance of, a contract between the data subject (Uber drivers) and a data controller (Uber)”, the entering of a contract is not a good enough reason for negatively affecting an individual where special categories of personal data are involved. According to the GDPR, biometric data – such as facial images – constitutes a ‘sensitive’ category of personal data.
“In short, the processing of biometric data, and the use of automated individual decision-making, including profiling, are only justified in very explicit circumstances. By discriminating against BAME drivers and automatically making decisions that harm them, Uber’s technology is not GDPR compliant. The latest judgement should serve as a stark warning to organisations using technology in this way.”
Taking legal action against Uber
To take action against Uber, drivers in England & Wales can register with us in confidence and tell us about their experience. We act on a no-win, no-fee basis.