The Equality and Human Rights Commission (EHRC) has been granted permission to intervene in an upcoming judicial review examining whether the Metropolitan Police’s use of live facial recognition technology (LFRT) complies with human rights law.

The regulator said that while it “acknowledges the potential value” of LFRT to policing, it believes the Met’s current policy governing its use is incompatible with Articles 8 (right to privacy), 10 (freedom of expression), and 11 (freedom of assembly and association) of the European Convention on Human Rights.

It claims that LFRT can be “intrusive”, particularly when used on a large scale, and warns that its deployment at protests could affect individuals’ rights under Articles 10 and 11.

The regulator noted: “Data shows that the number of black men triggering an ‘alert’ is higher than would be expected proportionally, when compared to the population of London. The EHRC welcomes that the Met, since 25 July 2024, has adopted a minimum accuracy threshold which it has said will limit the adverse impact on certain protected groups.”

“However […], the accuracy of the technology is paramount and even low error rates can translate to significant numbers of false identifications when using large watchlists.”
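The scale point can be illustrated with some simple arithmetic. The sketch below assumes a per-comparison false match rate of one in a million, a 10,000-person watchlist and 100,000 faces scanned at a large event; all of these figures are illustrative assumptions, not data from the case.

    # Illustrative arithmetic only: the error rate, crowd size and watchlist
    # size are assumptions for this example, not figures from the case.
    false_match_rate = 1e-6    # assumed chance one face wrongly matches one watchlist entry
    watchlist_size = 10_000    # assumed number of people on the watchlist
    faces_scanned = 100_000    # assumed number of faces screened at a large event

    # Under this simple model, expected false identifications grow with both
    # the crowd size and the watchlist size.
    expected_false_alerts = faces_scanned * watchlist_size * false_match_rate
    print(f"Expected false alerts: {expected_false_alerts:.0f}")  # prints 1000

Even with an error rate that sounds negligible, this simple model produces around a thousand false identifications, which is the effect the EHRC’s submission describes.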

Live facial recognition technology captures and analyses, in real time, the faces of individuals passing in front of CCTV cameras. It extracts unique biometric data from each face and compares it against a “watchlist” of people sought by the police.
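As a minimal sketch of that comparison step (assuming faces are reduced to fixed-length numeric templates compared by cosine similarity against a threshold; the dimensions, threshold and data here are invented for illustration and do not describe the Met’s system):

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Similarity of two biometric templates, between -1 and 1.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_against_watchlist(face_template, watchlist, threshold=0.6):
        # Return the watchlist entries this face resembles closely enough to alert on.
        return [i for i, entry in enumerate(watchlist)
                if cosine_similarity(face_template, entry) >= threshold]

    # Invented data: random 128-dimensional templates stand in for real faces.
    rng = np.random.default_rng(0)
    watchlist = [rng.normal(size=128) for _ in range(1_000)]
    live_face = rng.normal(size=128)
    print(f"{len(check_against_watchlist(live_face, watchlist))} alert(s) raised")

The threshold is where the accuracy trade-off the EHRC highlights sits: lowering it catches more genuine matches but also raises more false alerts.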

The watchlists often contain thousands of individuals. The Met Police has announced plans to use the technology to police major events such as Notting Hill Carnival.

As it stands, there is no specific domestic legislation regulating police use of LFRT. Instead, police rely on common law powers.

The EHRC noted that in R (Bridges) v Chief Constable of South Wales Police (2020), the Court of Appeal found that South Wales Police’s use of LFRT was unlawful, breaching privacy rights under Article 8 and the public sector equality duty in the Equality Act 2010.

Since the Bridges ruling, LFRT has been deployed more frequently, integrated into CCTV networks, and used with larger watchlists.

The EHRC’s submission states that these developments mean the use of the technology “poses a threat to human rights”.

The regulator’s submission also highlights international legal and policy developments relating to LFRT and AI regulation, such as the EU AI Act, which classifies LFRT for law enforcement as “high risk” and says it should be used only when strictly necessary and subject to safeguards.

The claimant in the case, Mr Thompson, is bringing the judicial review after he was wrongly identified by LFRT.

John Kirkpatrick, Chief Executive of the Equality and Human Rights Commission, said: “Live facial recognition technology is a tool which, when used responsibly, can help to combat serious crime and keep people safe. But the data this technology processes is biometric data, which is deeply personal.

“The law is clear: everyone has the right to privacy, to freedom of expression and to freedom of assembly. These rights are vital for any democratic society.

“As such, there must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan Police’s current policy falls short of this standard. The Met, and other forces using this technology, need to ensure they deploy it in ways which are consistent with the law and with human rights.”

A spokesperson for the Metropolitan Police said: “We believe our use of LFR is both lawful and proportionate, playing a key role in keeping Londoners safe. We welcome the Equality and Human Rights Commission’s (EHRC) recognition of LFR’s potential in policing.

“The Court of Appeal has confirmed the police can use LFR under common law powers, with the Met carefully developing a policy to operate the technology in a way which protects people’s rights and privacy.

“As part of this model, we have strong safeguards in place, with biometric data automatically deleted unless there is a match.

“Independent research from the National Physical Laboratory has also helped us configure the technology in a way that avoids discrimination.”

Lottie Winson
