Researchers devise an audit tool to test whether police use of facial recognition poses a threat to fundamental human rights, and analyse three deployments of the technology by British forces, with all three failing to meet "minimum ethical and legal standards".

Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police

Gina Neff

A team from the University of Cambridge's Minderoo Centre for Technology and Democracy created the new audit tool to evaluate "compliance with the law and national guidance" around issues such as privacy, equality, and freedom of expression and assembly.

Based on the findings, the experts are joining calls for a ban on police use of facial recognition in public spaces.

"There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology," said the report's lead author Evani Radiya-Dixit, a visiting fellow at Cambridge's Minderoo Centre.

"To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology."

Researchers constructed the audit tool based on current legal guidelines, including the UK's Data Protection and Equality acts, as well as outcomes from UK court cases and feedback from civil society organisations and the Information Commissioner's Office.

They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. One was the Bridges court case, in which a Cardiff-based civil liberties campaigner appealed against South Wales Police's use of automated FRT to live-scan crowds and compare faces to those on a criminal "watch list".

The researchers also tested the Metropolitan Police's trials of similar live FRT use, and a further example from South Wales Police in which officers used FRT apps on their smartphones to scan crowds in order to identify "wanted individuals in real time".

In all three cases, they found that important information about police use of FRT is "kept from view", including scant demographic data published on arrests or other outcomes, making it difficult to evaluate whether the tools "perpetuate racial profiling", the researchers say.

In addition to a lack of transparency, the researchers found little in the way of accountability, with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. "Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology," said Radiya-Dixit.

Some of the FRT uses lacked regular oversight from an independent ethics committee or indeed the public, say the researchers, and did not do enough to ensure there was a reliable "human in the loop" when scanning untold numbers of faces among crowds of thousands while hunting for criminals.

In South Wales Police's smartphone app trial, the "watch list" even included images of people who were innocent under UK law, those previously arrested but not convicted, despite the fact that retention of such images is unlawful.

"We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition," said Radiya-Dixit.

Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy, said: "Over the last few years, police forces around the world, including in England and Wales, have deployed facial recognition technologies. Our goal was to assess whether these deployments used known practices for the safe and ethical use of these technologies."

"Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police," Neff said.

Officers are increasingly under-resourced and overburdened, write the researchers, and FRT is seen as a fast, efficient and cheap way to track down persons of interest.

At least ten police forces in England and Wales have trialled facial recognition, including trials of FRT for operational policing purposes, although different forces apply different standards.

Questions of privacy run deep for policing technology that scans and potentially retains vast numbers of facial images without knowledge or consent. The researchers highlight a possible "chilling effect" if FRT leads to a reluctance among the public to exercise fundamental rights, such as the right to protest, for fear of potential consequences.

Use of FRT also raises discrimination concerns. The researchers point out that, historically, surveillance systems have been used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities.

Given regulatory gaps and failures to meet minimum standards set out by the new audit toolkit, the researchers write that they support calls for a "ban on police use of facial recognition in publicly accessible spaces".


