UK policing minister Diana Johnson has announced that the current Labour government will hold a series of discussions on police use of live facial recognition (LFR) before the end of the year, inviting regulators and civil society groups.
The potential of live facial recognition to contribute to making streets safer is clear, says Johnson.
“However, some have very legitimate concerns about it, including misidentification, misuse and the effect on human rights and individual privacy,” she says. “I’m very pleased that we’ve had the opportunity to start that today.”
The minister’s announcement came during the first debate on regulating the technology, held on Wednesday at Westminster Hall. Members of Parliament (MPs) debated well-known issues such as racial bias, accuracy and the technology’s impact on rights. But the most important question may be the lack of dedicated UK legislation on facial recognition.
“In Europe, the controls are very strong,” says Conservative MP John Whittingdale. “In this country, it’s left largely to police officers to interpret the law and to be reasonably confident, but there are legal challenges underway.”
The UK police have been relying on the technology to capture criminals at sporting venues and public transportation hubs, as well as to secure high-profile events such as the Coronation of King Charles III and music concerts. Deployments have also been coming to high streets and shopping centers.
The technology is currently governed by a patchwork of legislation including common law, the Police and Criminal Evidence Act (PACE), the UK General Data Protection Regulation and more. In August, digital rights group Big Brother Watch challenged the legality of the London Metropolitan Police’s deployment of facial recognition after the misidentification of Shaun Thompson, an anti-knife crime community worker.
According to the Information Commissioner’s Office (ICO), there are no clear guidelines for the police on how live facial recognition should be used. As there is no blanket approval by the ICO, deployments must essentially be judged on a case-by-case basis, explains Whittingdale.
“I think there is a real need for us to have clarity,” he adds.
The lack of specific legislation for facial recognition technology leaves huge room for misuse and overreach, according to MP and independent politician Iqbal Mohamed.
“As an ex-software test manager, I am extremely concerned that private companies who are profiting from their technology are allowed to self-regulate and to confirm the efficacy of the products that they sell,” says Mohamed.
Police authorities argue that their technology is independently tested.
Live facial recognition checks facial images captured by cameras against a watch list compiled from police databases. The UK National Physical Laboratory (NPL) has shown the facial recognition system used by the country’s largest police force, the Met Police, is “very accurate” as long as it’s used at higher face-match thresholds.
Police, however, have the option of lowering face-match thresholds without judicial oversight, according to Labour MP Dawn Butler. It is easy to imagine a scenario in which a police service lowers the threshold to get more hits and prove that the system it bought is value for money. Lowering the face-match threshold, however, leads to more people being misidentified, with Black subjects experiencing more false positives.
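The trade-off Butler describes can be illustrated with a toy sketch (not the Met’s actual system, and the scores and threshold values below are invented for illustration): each face seen by a camera gets a similarity score against the watch list, and lowering the alert threshold turns more near-miss scores from innocent passersby into false alerts.

```python
# Hypothetical similarity scores for passersby who are NOT on the
# watch list (values invented for illustration).
scores_of_innocent_passersby = [0.42, 0.55, 0.58, 0.61, 0.66, 0.70]

def alerts(scores, threshold):
    """Count how many scores would trigger an alert at this threshold."""
    return sum(score >= threshold for score in scores)

# A higher threshold flags fewer innocent people than a lower one.
high_threshold_hits = alerts(scores_of_innocent_passersby, 0.64)  # 2 false alerts
low_threshold_hits = alerts(scores_of_innocent_passersby, 0.55)   # 5 false alerts
assert high_threshold_hits < low_threshold_hits
```

The same mechanism that produces more true hits at a lower threshold also produces more false positives, which is why the NPL’s “very accurate” finding was conditioned on higher thresholds.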
The Met Police used live facial recognition 117 times between January and August 2024, compared with 32 times between 2020 and 2023, according to data compiled by a group of London Assembly members. The force also has thousands of people on its system who should not be there, adds Butler.
“Live facial recognition changes one of the cornerstones of our democracy, which is an individual is innocent until proven guilty. Now with live facial recognition, if the machine says you’re guilty because we’ve identified you through LFR, then you then have to prove you’re innocent,” she says.
Former Conservative policing minister and current Shadow Home Secretary Chris Philp, however, says that nobody is being convicted based on a facial match alone. When the technology was first introduced seven years ago, reports about bias in the algorithms were accurate; the algorithms, however, have developed a great deal since then, he explains.
While 70 to 85 percent of physical stop and search actions initiated by police officers are not successful, only 0.02 percent of facial recognition checks are unsuccessful.
“This technology is 4500 times less likely to result in someone being inappropriately stopped than a regular stop and search,” says Philp, who also serves as MP for Croydon.
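Philp’s “4500 times” figure can be roughly sanity-checked against the rates quoted above. Taking the upper end (85 percent) of the stop-and-search failure range and the 0.02 percent LFR figure, as assumed inputs:

```python
# Rough sanity check of the quoted ratio, assuming the upper end (85%)
# of the stop-and-search failure range and the 0.02% LFR figure.
stop_and_search_failure = 0.85   # 85 percent of stops unsuccessful
lfr_failure = 0.0002             # 0.02 percent of LFR checks unsuccessful

ratio = stop_and_search_failure / lfr_failure
print(round(ratio))  # 4250, close to the "4500 times" Philp cites
```

The exact ratio depends on which end of the 70 to 85 percent range is used, which may explain the rounding in the quoted figure.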
The deployment of live facial recognition in Croydon has resulted in approximately 200 arrests of people who would not otherwise have been arrested, including for grievous bodily harm, fraud, domestic burglary and rape. To preserve these results, primary legislation may not be the right approach for Britain; instead, Parliament should ensure legislation is flexible enough to accommodate changing technology, according to Philp.
“There is an argument or a debate to be had about whether, for the sake of clarity and for the sake of democratic accountability, we in Parliament should set something out more formally,” says Philp. “I think there is some merit in doing that, in clarifying at a national level where these guidelines sit. But I wouldn’t go as far as Europe, because if we did, both rapists would not have been arrested.”