University of Cambridge - Gina Neff /taxonomy/people/gina-neff en Opinion: Whether democracy can survive AI will depend on us /stories/Gina-Neff-AI-democracy <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>What is the best framework for the global governance of AI? How do we respond to tech companies who argue against regulation? Is our current pace of technological change ultimately greater than our ability to manage it?</p> </p></div></div></div> Mon, 14 Apr 2025 16:13:42 +0000 lw355 249322 at Forcing UK creatives to ‘opt out’ of AI training risks stifling new talent, Cambridge experts warn /research/news/forcing-uk-creatives-to-opt-out-of-ai-training-risks-stifling-new-talent-cambridge-experts-warn <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/kyle-loftus-3ucqtxsva88-unsplash-copy.jpg?itok=uG3F4ETE" alt="Videographer in studio with a model" title="Credit: Kal Visuals - Unsplash" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The UK government should resist allowing AI companies to scrape all copyrighted works unless the holder has actively ‘opted out’, as it puts an unfair burden on up-and-coming creative talents who lack the skills and resources to meet legal requirements.</p> <p><a href="https://www.mctd.ac.uk/policy-brief-ai-copyright-productivity-uk-creative-industries/">This is according to a new report</a> from University of Cambridge experts in economics, policy and machine learning, who also argue the UK government should clearly state that only a human author can hold copyright – even when AI has been heavily involved.</p> <p>A collaboration between
three Cambridge initiatives – the Minderoo Centre for Technology and Democracy, the Bennett Institute for Public Policy, and ai@cam – the report argues that unregulated use of generative AI will not guarantee economic growth, and risks damaging the UK’s thriving creative sector. </p> <p>If the UK adopts the <a href="https://www.gov.uk/government/consultations/copyright-and-artificial-intelligence/copyright-and-artificial-intelligence#c-our-proposed-approach">proposed ‘rights reservation’ for AI data mining</a>, rather than maintaining the legal foundation that automatically safeguards copyright, it will compromise the livelihoods of many in the sector, particularly those just starting out, say researchers.</p> <p>They argue that it risks allowing artistic content produced in the UK to be scraped for endless reuse by offshore companies.</p> <p>“Going the way of an opt-out model is telling Britain’s artists, musicians, and writers that tech industry profitability is more valuable than their creations,” said Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy.</p> <p>“Ambitions to strengthen the creative sector, bolster the British economy and spark innovation using GenAI in the UK can be achieved – but we will only get results that benefit all of us if we put people’s needs before tech companies.”</p> <p><strong>'Ingested' by technologies</strong></p> <p>Creative industries contribute around £124.6 billion or 5.7% to the UK’s economy, and have a deep connection to the tech industry. 
For example, the UK video games industry is the largest in Europe, and contributed £5.12 billion to the UK economy in 2019.</p> <p>While AI could lead to a new generation of creative companies and products, the researchers say that little is currently known about how AI is being adopted within these industries, and where the skills gaps lie.</p> <p>“The Government ought to commission research that engages directly with creatives, understanding where and how AI is benefiting and harming them, and use it to inform policies for supporting the sector’s workforce,” said Neil Lawrence, DeepMind Professor of Machine Learning and Chair of ai@cam.</p> <p>“Uncertainty about copyright infringement is hindering the development of Generative AI for public benefit in the UK. For AI to be trusted and widely deployed, it should not make creative work more difficult.”</p> <p>In the UK, copyright is vested in the creator automatically if it meets the legal criteria. Some AI companies have tried to exploit ‘fair dealing’ – a loophole based around use for research or reporting – but this is undermined by the commercial nature of most AI.</p> <p>Now, some AI companies are brokering licensing agreements with publishers, and the report argues this is a potential way to ensure creative industries are compensated.</p> <p>While rights of performers, from singers to actors, currently cover reproductions of live performances, AI uses composites harvested from across a performer’s oeuvre, so rights relating to specific performances are unlikely to apply, say researchers.</p> <p>Further clauses in older contracts mean performers are having their work ‘ingested’ by technologies that didn’t exist when they signed on the dotted line.</p> <p>The researchers call on the government to fully adopt the Beijing Treaty on Audio Visual Performance, which the UK signed over a decade ago but is yet to implement, as it gives performers economic rights over all reproduction, distribution and rental.</p> <p>"
The current lack of clarity about the licensing and regulation of training data use is a lose-lose situation. Creative professionals aren't fairly compensated for their work being used to train AI models, while AI companies are hesitant to fully invest in the UK due to unclear legal frameworks,” said Prof Diane Coyle, the Bennett Professor of Public Policy.</p> <p>“We propose mandatory transparency requirements for AI training data and standardised licensing agreements that properly value creative works. Without these guardrails, we risk undermining our valuable creative sector in the pursuit of uncertain benefits from AI."</p> <p><strong>'Spirit of copyright law'</strong></p> <p>The Cambridge experts also look at questions of copyright for AI-generated work, and the extent to which ‘prompting’ AI can constitute ownership. They conclude that AI cannot itself hold copyright, and the UK government should develop guidelines on compensation for artists whose work and name feature in prompts instructing AI.</p> <p>When it comes to the proposed ‘opt-out’ solution, the experts say it is not “in the spirit of copyright law” and is difficult to enforce.
Even if creators do opt out, it is not clear how that data will be identified, labelled, and compensated, or even erased.</p> <p>It may be seen as giving ‘carte blanche’ to foreign-owned and managed AI companies to benefit from British copyrighted works without a clear mechanism for creators to receive fair compensation.</p> <p>“Asking copyright reform to solve structural problems with AI is not the solution,” said Dr Ann Kristin Glenster, Senior Policy Advisor at the Minderoo Centre for Technology and Democracy, and lead author of the report.</p> <p>“Our research shows that the business case has yet to be made for an opt-out regime that will promote growth and innovation of the UK creative industries.</p> <p>“Devising policies that enable the UK creative industries to benefit from AI should be the Government’s priority if it wants to see growth of both its creative and tech industries,” Glenster said.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The UK government’s proposed ‘rights reservation’ model for AI data mining tells British artists, musicians, and writers that “tech industry profitability is more valuable than their creations” say leading academics.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We will only get results that benefit all of us if we put people’s needs before tech companies</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Gina Neff</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a
href="https://unsplash.com/photos/man-in-green-and-brown-camouflage-jacket-holding-black-video-camera-3UcQtXSvA88" target="_blank">Kal Visuals - Unsplash</a></div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Thu, 20 Feb 2025 07:56:32 +0000 fpjl2 248711 at UK needs AI legislation to create trust so companies can ‘plug AI into British economy’ /research/news/uk-needs-ai-legislation-to-create-trust-so-companies-can-plug-ai-into-british-economy-report <div class="field
field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/ai-minderoopic.jpg?itok=KzyzmE0S" alt="Data Tunnel" title="Data Tunnel, Credit: Getty/BlackJack3D" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The British government should offer tax breaks for businesses developing AI-powered products and services, or applying AI to their existing operations, to 'unlock the UK’s potential for augmented productivity', according to a <a href="https://www.mctd.ac.uk/which-path-should-the-uk-take-to-build-national-capability-for-generative-ai/">new University of Cambridge report</a>.</p>&#13; &#13; <p>Researchers argue that the UK currently lacks the computing capacity and capital required to build 'generative' machine learning models fast enough to compete with US companies such as Google, Microsoft or OpenAI.</p>&#13; &#13; <p>Instead, they call for a UK focus on leveraging these new AI systems for real-world applications – such as developing new diagnostic products and addressing the shortage of software engineers – which could provide a major boost to the British economy.</p>&#13; &#13; <p>However, the researchers caution that without new legislation to ensure the UK has solid legal and ethical AI regulation, such plans could falter.
British industries and the public may struggle to trust emerging AI platforms such as ChatGPT enough to invest time and money into skilling up.</p>&#13; &#13; <p>The policy report is a collaboration between Cambridge’s <a href="https://www.mctd.ac.uk/">Minderoo Centre for Technology and Democracy</a>, <a href="https://www.bennettinstitute.cam.ac.uk/">Bennett Institute for Public Policy</a>, and <a href="https://ai.cam.ac.uk/">ai@cam</a>: the University’s flagship initiative on artificial intelligence.</p>&#13; &#13; <p>“Generative AI will change the nature of how things are produced, just as what occurred with factory assembly lines in the 1910s or globalised supply chains at the turn of the millennium,” said Dame Diane Coyle, Bennett Professor of Public Policy. “The UK can become a global leader in actually plugging these AI technologies into the economy.”</p>&#13; &#13; <p>Prof Gina Neff, Executive Director of the Minderoo Centre for Technology and Democracy, said: “A new Bill that fosters confidence in AI by legislating for data protection, intellectual property and product safety is vital groundwork for using this technology to increase UK productivity.”</p>&#13; &#13; <p>Generative AI uses algorithms trained on giant datasets to output original high-quality text, images, audio, or video at ferocious speed and scale. The text-based ChatGPT dominated headlines this year. Other examples include Midjourney, which can conjure imagery in any style in seconds.</p>&#13; &#13; <p>Networked grids – or clusters – of computing hardware called Graphics Processing Units (GPU) are required to handle the vast quantities of data that hone these machine-learning models. For example, ChatGPT is estimated to cost $40 million a month in computing alone.
In the spring of this year, the UK chancellor announced £100 million for a “Frontier AI Taskforce” to scope out the creation of home-grown AI to rival the likes of Google Bard.</p>&#13; &#13; <p>However, the report points out that the supercomputer announced by the UK chancellor is unlikely to be online until 2026, while none of the big three US tech companies – Amazon, Microsoft or Google – have GPU clusters in the UK.</p>&#13; &#13; <p>“The UK has no companies big enough to invest meaningfully in foundation model development,” said report co-author Sam Gilbert. “State spending on technology is modest compared to China and the US, as we have seen in the UK chip industry.”</p>&#13; &#13; <p>As such, the UK should use its strengths in fin-tech, cybersecurity and health-tech to build software – the apps, tools and interfaces – that harnesses AI for everyday use, says the report.</p>&#13; &#13; <p>“Generative AI has been shown to speed up coding by some 55%, which could help with the UK’s chronic developer shortage,” said Gilbert. “In fact, this type of AI can even help non-programmers to build sophisticated software.”</p>&#13; &#13; <p>Moreover, the UK has world-class research universities that could drive progress in tackling AI stumbling blocks: from the cooling of data centres to the detection of AI-generated misinformation.</p>&#13; &#13; <p>At the moment, however, UK organisations lack incentives to comply with responsible AI principles.
“The UK’s current approach to regulating generative AI is based on a set of vague and voluntary principles that nod at security and transparency,” said report co-author Dr Ann Kristin Glenster.</p>&#13; &#13; <p>“The UK will only be able to realise the economic benefits of AI if the technology can be trusted, and that can only be ensured through meaningful legislation and regulation.”</p>&#13; &#13; <p>Along with new AI laws, the report suggests a series of tax incentives, such as an enhanced Seed Enterprise Investment Scheme, to increase the supply of capital to AI start-ups, as well as tax credits for all businesses including generative AI in their operations. Challenge prizes could be launched to identify bottom-up uses of generative AI from within organisations.</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Legislating for AI safety and transparency will allow British industry and education to put resources into AI development with confidence, argue researchers.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">The UK can become a global leader in actually plugging these AI technologies into the economy</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Diane Coyle </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Getty/BlackJack3D</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Data Tunnel</div></div></div><div class="field
field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 16 Oct 2023 06:20:05 +0000 fpjl2 242671 at Cambridge academics join £31 million consortium to develop trustworthy and secure AI /research/news/cambridge-academics-join-ps31-million-consortium-to-develop-trustworthy-and-secure-ai <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/artificial-intelligence-gbee02400c-1280-web.jpg?itok=5QZl1Yxf" alt="Graphic showing the letters A and I in the centre of computer networks" title="Artificial intelligence, Credit: geralt (Pixabay)" /></div></div></div><div class="field field-name-body
field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The announcement was made by UK Research and Innovation (UKRI), which, as part of London Tech Week, today unveiled a suite of AI investments that will bring academic and industry partners together.</p>&#13; &#13; <p>Identified by the Government as a critical technology as set out in the UK Science and Technology Framework, AI is a rapidly developing science and technology area with massive potential benefits to the economy and society.</p>&#13; &#13; <p>As part of the announcement package, £31 million has been awarded to <a href="https://rai.ac.uk/">Responsible AI UK</a>, a large consortium led by the University of Southampton that aims to create a UK and international research and innovation ecosystem for responsible and trustworthy AI that will be responsive to the needs of society.</p>&#13; &#13; <p>Led by Professor Gopal Ramchurn at Southampton, the consortium will pioneer a reflective, inclusive approach to responsible AI development, working across universities, businesses, public and third sectors and the general public. It will fund multi-disciplinary research that helps us understand what responsible and trustworthy AI is, how to develop it and build it into existing systems, and the impacts it will have on society.</p>&#13; &#13; <p>Gina Neff, Executive Director of the Minderoo Centre for Technology and Democracy, University of Cambridge, will direct the strategy group for RAI UK.</p>&#13; &#13; <p>Speaking about the launch, Neff said: “I am delighted to be a part of RAI UK. We will work to link Britain’s world-leading responsible AI ecosystem and lead a national conversation around AI, to ensure that responsible and trustworthy AI can power benefits for everyone.”</p>&#13; &#13; <p>The consortium will convene national conversations on responsible AI and help bring coherence to the AI ecosystem across the whole of the UK.
It will work closely with policymakers to provide evidence for future policy and regulation, as well as guidance for businesses in deploying AI solutions responsibly.</p>&#13; &#13; <p>The consortium’s activities will encompass large-scale research programmes, collaborations between academics and businesses, skills programmes for the public and industry, and the publication of white papers outlining approaches for the UK and global AI landscape.</p>&#13; &#13; <p><em>Adapted from a news story by the <a href="https://www.mctd.ac.uk/">Minderoo Centre for Technology and Democracy</a></em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers from Cambridge’s Minderoo Centre for Technology and Democracy are part of a £31 million consortium to create a UK and international research and innovation ecosystem for responsible and trustworthy AI.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We will work to link Britain’s world-leading responsible AI ecosystem and lead a national conversation around AI, to ensure that responsible and trustworthy AI can power benefits for everyone</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Gina Neff</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://pixabay.com/illustrations/artificial-intelligence-ai-6767502/" target="_blank">geralt (Pixabay)</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Artificial
intelligence</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/public-domain">Public Domain</a></div></div></div> Wed, 14 Jun 2023 09:40:26 +0000 cjb250 239901 at UK police fail to meet 'legal and ethical standards' in use of facial recognition /research/news/uk-police-fail-to-meet-legal-and-ethical-standards-in-use-of-facial-recognition <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid"
src="/sites/default/files/styles/content-580x288/public/news/research/news/minderoo.jpg?itok=bhJ0zBmS" alt="" title="Image from the report &#039;A Sociotechnical Audit: Assessing Police use of Facial Recognition&#039;, Credit: Minderoo Centre for Technology and Democracy " /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A team from the University of Cambridge’s <a href="https://www.mctd.ac.uk/">Minderoo Centre for Technology and Democracy</a> created the new audit tool to evaluate “compliance with the law and national guidance” around issues such as privacy, equality, and freedom of expression and assembly.</p> <p>Based on the findings, <a href="https://www.mctd.ac.uk/a-sociotechnical-audit-assessing-police-use-of-facial-recognition/">published in a new report</a>, the experts are joining calls for a ban on police use of facial recognition in public spaces.</p> <p>“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said the report’s lead author Evani Radiya-Dixit, a visiting fellow at Cambridge’s Minderoo Centre.</p> <p>“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”</p> <p>Researchers constructed the audit tool based on current legal guidelines – including the UK’s Data Protection and Equality acts – as well as outcomes from UK court cases and feedback from civil society organisations and the Information Commissioner's Office.</p> <p>They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. One was the Bridges court case, in which a Cardiff-based civil liberties campaigner appealed against South Wales Police’s use of automated FRT to live-scan crowds and compare faces to those on a criminal “watch list”.
</p> <p>The researchers also tested the Metropolitan Police’s trials of similar live FRT use, and a further example from South Wales Police in which officers used FRT apps on their smartphones to scan crowds in order to identify “wanted individuals in real time”.</p> <p>In all three cases, they found that important information about police use of FRT is “kept from view”, including scant demographic data published on arrests or other outcomes, making it difficult to evaluate whether the tools “perpetuate racial profiling” say researchers.</p> <p>In addition to lack of transparency, the researchers found little in the way of accountability – with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” said Radiya-Dixit.</p> <p>Some of the FRT uses lacked regular oversight from an independent ethics committee or indeed the public, say the researchers, and did not do enough to ensure there was a reliable “human in the loop” when scanning untold numbers of faces among crowds of thousands while hunting for criminals.</p> <p>In the South Wales Police’s smartphone app trial, even the “watch list” included images of people innocent under UK law – those previously arrested but not convicted – despite the fact that retention of such images is unlawful.</p> <p>“We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition," said Radiya-Dixit.</p> <p>Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy, said: “Over the last few years, police forces around the world, including in England and Wales, have deployed facial recognition technologies.
Our goal was to assess whether these deployments used known practices for the safe and ethical use of these technologies.” </p> <p>“Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police,” Neff said.</p> <p>Officers are increasingly under-resourced and overburdened, write the researchers, and FRT is seen as a fast, efficient and cheap way to track down persons of interest.</p> <p>At least ten police forces in England and Wales have trialled facial recognition, with trials involving FRT use for operational policing purposes – although different forces use different standards.</p> <p>Questions of privacy run deep for policing technology that scans and potentially retains vast numbers of facial images without knowledge or consent. The researchers highlight a possible “chilling effect” if FRT leads to a reluctance to exercise fundamental rights among the public – right to protest, for example – for fear of potential consequences.</p> <p>Use of FRT also raises discrimination concerns.
The researchers point out that, historically, surveillance systems are used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities.</p> <p>Given regulatory gaps and failures to meet minimum standards set out by the new audit toolkit, the researchers write that they support calls for a “ban on police use of facial recognition in publicly accessible spaces”.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers devise an audit tool to test whether police use of facial recognition poses a threat to fundamental human rights, and analyse three deployments of the technology by British forces – with all three failing to meet “minimum ethical and legal standards”.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Gina Neff</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Minderoo Centre for Technology and Democracy </a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Image from the report &#039;A Sociotechnical Audit: Assessing Police use of Facial Recognition&#039;</div></div></div><div
class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 27 Oct 2022 15:13:25 +0000 fpjl2 234991 at