Rights groups have reacted angrily to the news that the government is expanding police use of live facial recognition (LFR) without adequate legislative safeguards.
The Home Office yesterday announced the deployment of 10 new LFR vans to seven forces across the country: Greater Manchester, West Yorkshire, Bedfordshire, Surrey and Sussex (jointly), and Thames Valley and Hampshire (jointly). London’s Metropolitan Police and South Wales Police already use the controversial technology.
Privacy group Big Brother Watch argued that the move represents a “significant expansion of the surveillance state.”
Interim director Rebecca Vincent added: “The Home Office must scrap its plans to roll out further live facial recognition capacity until robust legislative safeguards are established.”
Those safeguards have failed to materialize, despite police forces having used LFR for several years. Last year, a House of Lords committee said it was “deeply concerned that its use is being expanded without proper scrutiny and accountability.”
The government also announced yesterday that it would launch a consultation in the autumn designed to help it shape a “new legal framework.”
It added that:
- Use of LFR must follow College of Policing guidelines and comply with the surveillance camera code of practice
- Only trained officers will be deployed to operate LFR vans
- The algorithm used has been independently tested by the National Physical Laboratory (NPL) and found to show no bias with respect to ethnicity, age or gender
- Faces from a live feed will only be checked against police watchlists of “wanted criminals, suspects and those subject to bail or court order conditions” (although it’s unclear how these lists are drawn up)
The government further justified its decision by citing figures claiming the Met Police made 580 arrests using LFR in a 12-month period, for offenses including rape, domestic abuse, knife crime, grievous bodily harm (GBH) and robbery. It said these included 52 registered sex offenders arrested for breaching their conditions.
“The increased access to LFR vehicles to forces that previously did not have the capability is an excellent opportunity for policing,” argued Lindsey Chiswick, National Police Chiefs Council lead for facial recognition.
“Each LFR deployment will be targeted, intelligence-led, within a set geographical location and for defined period of time, ensuring deployments are proportionate, lawful and necessary.”
The privacy regulator also issued a statement yesterday, reminding police forces that they must abide by existing laws.
“Facial Recognition Technology (FRT) does not operate in a legal vacuum. It is covered by data protection law, which requires any use of personal data, including biometric data, to be lawful, fair and proportionate,” the Information Commissioner’s Office (ICO) said.
“When used by the police, FRT must be deployed in a way that respects people’s rights and freedoms, with appropriate safeguards in place.”
A Facial Recognition Scandal in the Making
However, concerns remain that the authorities cannot be trusted to use the technology responsibly. These were bolstered by Privacy International claims earlier this month that the government has secretly allowed police forces to search over 150 million UK passport and immigration database photos using facial recognition technology for the past six years.
The rights group said Freedom of Information (FOI) enquiries revealed the number of searches of the passport database had shot up from two in 2020 to 417 in 2023, while searches of the immigration database rose from 16 in 2023 to 102 in 2024.
“This secret program is a dangerous infringement on our fundamental rights to privacy and to express ourselves freely, both online and in public,” argued Privacy International senior technologist, Nuno Guerreiro de Sousa.
“It is especially hypocritical that this is happening in a country that prides itself on upholding human rights. This is why we are standing firm in challenging it.”
Big Brother Watch and Privacy International are now taking legal action against the UK government.
In the meantime, the ICO said it would shortly share the findings of a recent audit of South Wales Police’s and Gwent Police’s use of facial recognition technology, which will assess their compliance with data protection laws.