Computers can now mimic the problem-solving and decision-making capabilities of the human brain. The law has always battled to reconcile its application with technological advances, and it is now the turn of the Equality Act to enter the ring against the application of AI, in the form of Uber’s facial recognition software. The central question to be determined is whether algorithmic decision-making has led to indirect discrimination and to drivers being unfairly dismissed.
The App Drivers & Couriers Union and the Independent Workers Union of Great Britain (IWGB) have filed a claim at the Central London Employment Tribunal on behalf of three drivers: Pa Edrissa Manjang, a former UberEats courier; Imran Javaid Raja, a former Uber private hire driver; and a third, anonymous driver. The claim against Uber alleges that the company’s automated facial-verification software is not able to effectively identify people with darker skin, and that drivers have been dismissed on the basis of failed verification attempts without the underlying images being subjected to human review.
For Raja, the failed facial recognition checks led to his dismissal and to Uber reporting him to Transport for London, which revoked his private hire driver and vehicle licence. Similarly, Manjang was dismissed by Uber after failing the facial recognition checks, despite his request that the images be reviewed by a human.
A spokesperson for Uber said: “Our Real-Time ID Check is designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel.” However, it has been suggested that the software has a failure rate of 20.8% for darker-skinned female users and 6% for darker-skinned male users; for light-skinned men, the failure rate is zero.
The Equality Act protects UK employees from discrimination on the grounds of protected characteristics, such as sex, race, age and disability; in this case, the allegation is that the drivers affected have been treated less favourably on the grounds of race. If a company applies a provision, criterion or practice (PCP) to all staff which places a cohort of staff with a particular protected characteristic at a disadvantage, when compared with those who do not share that protected characteristic, there may be a case for indirect discrimination.
In relation to the algorithmic processing of data, individuals also have the protection of the UK GDPR. Article 22 severely limits a data controller’s ability to make decisions based solely on automated processing without human intervention.
The Equality and Human Rights Commission (EHRC) and the Worker Info Exchange are supporting the workers’ claims. Henry Chango Lopez, the General Secretary of the IWGB, stated: “Uber’s continued use of a facial recognition algorithm that is ineffective on people of colour is discriminatory.”
The preliminary hearing is listed for February 2022: ding, round one.
If you have any questions regarding direct or indirect discrimination, or how algorithmic decision-making may affect employment, please contact Linky Trott, or any other member of the Employment team.
If you aren’t receiving our legal updates directly to your mailbox, please sign up now.
Please note that this blog is provided for general information only. It is not intended to amount to advice on which you should rely. You must obtain professional or specialist advice before taking, or refraining from, any action on the basis of the content of this blog.
Edwin Coe LLP is a Limited Liability Partnership, registered in England & Wales (No.OC326366). The Firm is authorised and regulated by the Solicitors Regulation Authority. A list of members of the LLP is available for inspection at our registered office address: 2 Stone Buildings, Lincoln’s Inn, London, WC2A 3TH. “Partner” denotes a member of the LLP or an employee or consultant with the equivalent standing.