University of Cambridge - privacy /taxonomy/subjects/privacy en What is the metaverse – and will it help us or harm us? /stories/metaverse <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>An interconnected world of extended reality is coming that will reshape how we work, play and communicate – and expose us to new levels of risk. What is the metaverse? Will we be safe? How do we make the most of it?</p> </p></div></div></div> Thu, 27 Jul 2023 07:52:28 +0000 lw355 241051 at UK police fail to meet 'legal and ethical standards' in use of facial recognition /research/news/uk-police-fail-to-meet-legal-and-ethical-standards-in-use-of-facial-recognition <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/minderoo.jpg?itok=bhJ0zBmS" alt="" title="Image from the report &#039;A Sociotechnical Audit: Assessing Police use of Facial Recognition&#039;, Credit: Minderoo Centre for Technology and Democracy " /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A team from the University of Cambridge’s <a href="https://www.mctd.ac.uk/">Minderoo Centre for Technology and Democracy</a> created the new audit tool to evaluate “compliance with the law and national guidance” around issues such as privacy, equality, and freedom of expression and assembly.</p> <p>Based on the findings, <a href="https://www.mctd.ac.uk/a-sociotechnical-audit-assessing-police-use-of-facial-recognition/">published in a new report</a>, the experts are joining calls for a ban on police use of facial recognition in public spaces.</p> <p>“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said the report’s lead author Evani Radiya-Dixit, a visiting fellow at Cambridge’s Minderoo Centre.</p> <p>“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”</p> <p>Researchers constructed the audit tool based on current legal guidelines – including the UK’s Data Protection and Equality acts – as well as outcomes from UK court cases and feedback from civil society organisations and the Information Commissioner's Office.</p> <p>They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. One was the Bridges court case, in which a Cardiff-based civil liberties campaigner appealed against South Wales Police’s use of automated FRT to live-scan crowds and compare faces to those on a criminal “watch list”.</p> <p>The researchers also tested the Metropolitan Police’s trials of similar live FRT use, and a further example from South Wales Police in which officers used FRT apps on their smartphones to scan crowds in order to identify “wanted individuals in real time”.</p> <p>In all three cases, they found that important information about police use of FRT is “kept from view”, including scant demographic data published on arrests or other outcomes, making it difficult to evaluate whether the tools “perpetuate racial profiling”, say researchers.</p> <p>In addition to lack of transparency, the researchers found little in the way of accountability – with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” said Radiya-Dixit.</p> <p>Some of the FRT uses lacked regular oversight from an independent ethics committee or indeed the public, say the researchers, and did not do enough to ensure there was a reliable “human in the loop” when scanning untold numbers of faces among crowds of thousands while hunting for criminals.</p> <p>In the South Wales Police’s smartphone app trial, even the “watch list” included images of people innocent under UK law – those previously arrested but not convicted – despite the fact that retention of such images is unlawful.</p> <p>“We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition,” said Radiya-Dixit.</p> <p>Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy, said: “Over the last few years, police forces around the world, including in England and Wales, have deployed facial recognition technologies. Our goal was to assess whether these deployments used known practices for the safe and ethical use of these technologies.”</p> <p>“Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police,” Neff said.</p> <p>Officers are increasingly under-resourced and overburdened, write the researchers, and FRT is seen as a fast, efficient and cheap way to track down persons of interest.</p> <p>At least ten police forces in England and Wales have trialled facial recognition, including for operational policing purposes – although different forces use different standards.</p> <p>Questions of privacy run deep for policing technology that scans and potentially retains vast numbers of facial images without knowledge or consent. The researchers highlight a possible “chilling effect” if FRT leads to a reluctance among the public to exercise fundamental rights – the right to protest, for example – for fear of potential consequences.</p> <p>Use of FRT also raises discrimination concerns. The researchers point out that, historically, surveillance systems have been used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities.</p> <p>Given regulatory gaps and failures to meet minimum standards set out by the new audit toolkit, the researchers write that they support calls for a “ban on police use of facial recognition in publicly accessible spaces”.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers devise an audit tool to test whether police use of facial recognition poses a threat to fundamental human rights, and analyse three deployments of the technology by British forces – with all three failing to meet “minimum ethical and legal standards”.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Gina Neff</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Minderoo Centre for Technology and Democracy </a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Image from the report &#039;A Sociotechnical Audit: Assessing Police use of Facial Recognition&#039;</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 27 Oct 2022 15:13:25 +0000 fpjl2 234991 at ‘Digital mask’ could protect patients’ privacy in medical records /stories/digital-masks <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Scientists have created a ‘digital mask’ that will allow facial images to be stored in medical records while preventing potentially sensitive personal biometric information from being extracted and shared.</p> </p></div></div></div> Thu, 15 Sep 2022 15:00:51 +0000 cjb250 234101 at The Internet of Stings: research will probe privacy and legal concerns of smart devices /research/news/the-internet-of-stings-research-will-probe-privacy-and-legal-concerns-of-smart-devices <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/howard-bouchevereau-876c-f8ybrg-unsplash.jpg?itok=EFEzMIsE" alt="Smart speaker" title="Smart speaker, Credit: Howard Bouchevereau via Unsplash" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>These questions have been worrying researchers at the University of Cambridge Department of Computer Science and Technology. Now they are launching a year-long investigation into the ways our information is being collected and whether these practices always comply with regulations and the law.</p>&#13; &#13; <p>Working in collaboration with colleagues at Imperial College London, they will probe the data that flows from the Internet of Things – the networked consumer devices, such as smart printers, doorbells and toys, that are an increasing presence in our homes.</p>&#13; &#13; <p>Backed by a grant from the Information Commissioner’s Office, the UK’s data protection regulator, they will be investigating what <a href="https://www.cst.cam.ac.uk/people/js573">Dr Jat Singh</a> describes as ‘the Internet of Stings’.</p>&#13; &#13; <p>Research shows that information from our devices often finds its way to a range of third parties, such as user-tracking and advertising networks that may mine it for valuable information about consumer behaviour. Singh is also worried about the occasions when data is transmitted from one country to another where there may be different rules, rights and restrictions around data and its use.</p>&#13; &#13; <p>So Singh and the research team want to investigate the transmission of data from our devices to find out if it is in line with relevant law – and to inform consumers about what they can do to take better control over their information.</p>&#13; &#13; <p>“We see ‘smart’ devices increasingly being worn on people's bodies and used in people's homes,” said Singh. “However, it’s often unclear what happens with the data these devices collect: where that data goes and how it is used. This is concerning, given these devices can often collect highly personal, private and sensitive information about ourselves and our lives.</p>&#13; &#13; <p>“This project seeks to shed light on the state of current commercial data practices by analysing the nature of data flows from both a technical and a data-rights perspective. We aim to show if there are any data protection implications and concerns in the consumer smart device landscape so that we can empower policymakers, regulators, and individuals alike.”<br /><br />&#13; Dr Singh leads the <a href="https://www.compacctsys.net/">Compliant &amp; Accountable Systems Research group</a>, a team of researchers working at the intersection of technology and law. They consider ways in which technology could be better designed and deployed to meet legal and regulatory concerns and work to inform policymakers and regulators about the technical realities of new and emerging technologies.</p>&#13; &#13; <p>Technical network-monitoring mechanisms have been used to establish the ways in which data is transmitted, the patterns of transmissions, and the destinations it ends up in. “This showed that potentially problematic data-flow appears to be rife in the Internet of Things,” said Singh.</p>
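&#13; &#13; <p>The article does not detail the team’s tooling, but the following is a minimal sketch of this kind of network monitoring, assuming the Python scapy packet-capture library, a capture point on the device’s home network (such as the gateway), and a hypothetical device address. Logging the DNS lookups a smart device makes gives a first approximation of which third-party hosts it contacts:</p>&#13; &#13; <pre><code>from scapy.all import sniff, IP, DNSQR

DEVICE_IP = "192.168.1.50"  # assumed address of the device under test

def log_dns(pkt):
    # Each DNS query the device issues names a host it is about to
    # contact -- a first approximation of where its data flows.
    if pkt.haslayer(IP) and pkt.haslayer(DNSQR) and pkt[IP].src == DEVICE_IP:
        print(pkt[DNSQR].qname.decode())

# Capture DNS traffic only; packet sniffing requires root privileges.
sniff(filter="udp port 53", prn=log_dns, store=False)
</code></pre>&#13; &#13; <p>A real study would go further – devices that hard-code server addresses or use encrypted DNS never appear in such a log – but even this simple view often reveals trackers and overseas endpoints.</p>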
&#13; &#13; <p>Over the next year, they’ll be taking a detailed look at whether devices actually transmit data in accordance with the privacy policies and other legal obligations of the companies that sell them.</p>&#13; &#13; <p>They will also explore the implications of mitigations that consumers might use, such as blocking particular data flows.</p>&#13; &#13; <p>They want to establish the nature and scale of any problems and see if vendor companies are being honest and fully transparent with their consumers and compliant with data protection and other laws. They also want to better inform not only device owners but also regulators and policy-makers about the suspected issues, which may help inform future interventions.</p>&#13; &#13; <p>“Problems with the data practices of the consumer smart devices have been suspected for some time, but not fully examined – from both a technical and legal perspective,” said Singh. “We need to do so if we want a better, fairer and more compliant Internet of Things.”</p>&#13; &#13; <p>Originally published on the <cite><a href="https://www.cst.cam.ac.uk/news/internet-stings">Computer Science and Technology website</a></cite>.</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>What happens to all the sensitive personal information our smart devices collect from us? Where does the data picked up by our smart watches, speakers and TVs go, who has access to it and how is it used?</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It’s often unclear what happens with the data these devices collect: where that data goes and how it is used. This is concerning, given these devices can often collect highly personal, private and sensitive information about ourselves and our lives</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Jat Singh</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://unsplash.com/photos/black-apple-homepod-speaker-on-table-876c-F8YBrg" target="_blank">Howard Bouchevereau via Unsplash</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Smart speaker</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 22 Oct 2021 08:43:11 +0000 sc604 227671 at In tech we trust? /research/features/in-tech-we-trust <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/features/david-werbrouck-304966-unsplash_0.jpg?itok=7L-Q6nEB" alt="" title="Credit: Daniel Werbrouck" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Dr Jat Singh is familiar with breaking new ground and working across disciplines. Even so, he and colleagues were pleasantly surprised by how much enthusiasm has greeted their new <a href="https://www.trusttech.cam.ac.uk/">Strategic Research Initiative on Trustworthy Technologies</a>, which brings together science, technology and humanities researchers from across the University.</p>&#13; &#13; <p>In fact, Singh, a researcher in Cambridge’s Department of Computer Science and Technology, has been collaborating with lawyers for several years: “A legal perspective is paramount when you’re researching the technical dimensions to compliance, accountability and trust in emerging ICT; although the Computer Lab is not the usual home for lawyers, we have two joining soon.”</p>&#13; &#13; <p>Governance and public trust present some of the greatest challenges in technology today.
The European General Data Protection Regulation (GDPR), which comes into force this year, has brought forward debates such as whether individuals have a ‘right to an explanation’ regarding decisions made by machines, and introduces stiff penalties for breaching data protection rules. “With penalties including fines of up to 4% of global turnover or €20 million, people are realising that they need to take data protection much more seriously,” he says.</p>
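&#13; &#13; <p>For context – a detail the article leaves implicit – the GDPR cap for the most serious infringements is the <em>higher</em> of the two figures, €20 million or 4% of worldwide annual turnover, so for large firms the percentage dominates. A toy illustration, not drawn from the article:</p>&#13; &#13; <pre><code># GDPR Art. 83(5): fines of up to EUR 20m or 4% of worldwide
# annual turnover, whichever is higher.
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A firm with EUR 2bn turnover faces a cap of EUR 80m, not EUR 20m.
assert gdpr_max_fine(2_000_000_000) == 80_000_000
</code></pre>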
&#13; &#13; <p>Singh is particularly interested in how data-driven systems and algorithms – including machine learning – will soon underpin and automate everything from transport networks to council services.</p>&#13; &#13; <p>As we work, shop and travel, computers and mobile phones already collect, transmit and process much data about us; as the ‘Internet of Things’ continues to instrument the physical world, machines will increasingly mediate and influence our lives.</p>&#13; &#13; <p>It’s a future that raises profound issues of privacy, security, safety and ultimately trust, says Singh, whose research is funded by an Engineering and Physical Sciences Research Council Fellowship: “We work on mechanisms for better transparency, control and agency in systems, so that, for instance, if I give data to someone or something, there are means for ensuring they’re doing the right things with it. We are also active in policy discussions to help better align the worlds of technology and law.”</p>&#13; &#13; <p>What it means to trust machine learning systems also concerns Dr Adrian Weller. Before becoming a senior research fellow in the Department of Engineering and a Turing Fellow at The Alan Turing Institute, he spent many years working in trading for leading investment banks and hedge funds, and has seen first-hand how machine learning is changing the way we live and work.</p>&#13; &#13; <p>“Not long ago, many markets were traded on exchanges by people in pits screaming and yelling,” Weller recalls. “Today, most market making and order matching is handled by computers. Automated algorithms can typically provide tighter, more responsive markets – and liquid markets are good for society.”</p>&#13; &#13; <p>But cutting humans out of the loop can have unintended consequences, as the flash crash of 2010 shows. During 36 minutes on 6 May, nearly one trillion dollars were wiped off US stock markets as an unusually large sell order produced an emergent coordinated response from automated algorithms. “The flash crash was an important example illustrating that over time, as we have more AI agents operating in the real world, they may interact in ways that are hard to predict,” he says.</p>&#13; &#13; <p><a href="/system/files/issue_35_research_horizons_new.pdf"><img alt="" src="/sites/www.cam.ac.uk/files/inner-images/front-cover_for-web.jpg" style="width: 288px; height: 407px; float: right;" /></a>Algorithms are also beginning to be involved in critical decisions about our lives and liberty. In medicine, machine learning is helping diagnose diseases such as cancer and diabetic retinopathy; in US courts, algorithms are used to inform decisions about bail, sentencing and parole; and on social media and the web, our personal data and browsing history shape the news stories and advertisements we see.</p>&#13; &#13; <p>How much we trust the ‘black box’ of machine learning systems, both as individuals and society, is clearly important. “There are settings, such as criminal justice, where we need to be able to ask why a system arrived at its conclusion – to check that appropriate process was followed, and to enable meaningful challenge,” says Weller. “Equally, to have effective real-world deployment of algorithmic systems, people will have to trust them.”</p>&#13; &#13; <p>But even if we can lift the lid on these black boxes, how do we interpret what’s going on inside? “There are many kinds of transparency,” he explains. “A user contesting a decision needs a different kind of transparency to a developer who wants to debug a system. And a third form of transparency might be needed to ensure a system is accountable if something goes wrong, for example an accident involving a driverless car.”</p>&#13; &#13; <p>If we can make them trustworthy and transparent, how can we ensure that algorithms do not discriminate unfairly against particular groups? While it might be useful for Google to advertise products it ‘thinks’ we are most likely to buy, it is more disquieting to discover the assumptions it makes based on our name or postcode.</p>&#13; &#13; <p>When Latanya Sweeney, Professor of Government and Technology in Residence at Harvard University, tried to track down one of her academic papers by Googling her name, she was shocked to be presented with ads suggesting that she had been arrested. After much research, she discovered that “black-sounding” names were 25% more likely to result in the delivery of this kind of advertising.</p>&#13; &#13; <p>Like Sweeney, Weller is both disturbed and intrigued by examples of machine-learned discrimination. “It’s a worry,” he acknowledges. “And people sometimes stop there – they assume it’s a case of garbage in, garbage out, end of story. In fact, it’s just the beginning, because we’re developing techniques that can automatically detect and remove some forms of bias.”</p>
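&#13; &#13; <p>One simple diagnostic of this kind – a generic fairness check, not necessarily the technique Weller’s group uses – is to compare a model’s positive-prediction rates across demographic groups, a criterion known as demographic parity. A minimal sketch:</p>&#13; &#13; <pre><code>from collections import defaultdict

def positive_rates(predictions, groups):
    """Rate of positive predictions per demographic group.
    predictions: iterable of 0/1 model outputs;
    groups: iterable of group labels aligned with predictions."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for pred, group in zip(predictions, groups):
        counts[group][0] += pred
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

rates = positive_rates([1, 0, 1, 1, 0, 0], ["a", "a", "a", "b", "b", "b"])
# A large gap between rates["a"] and rates["b"] flags a potential
# bias worth investigating before deployment.
</code></pre>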
&#13; &#13; <p>Transparency, reliability and trustworthiness are at the core of Weller’s work at the Leverhulme Centre for the Future of Intelligence and The Alan Turing Institute. His project grapples with how to make machine-learning decisions interpretable, develop new ways to ensure that AI systems perform well in real-world settings, and examine whether empathy is possible – or desirable – in AI.</p>&#13; &#13; <p>Machine learning systems are here to stay. Whether they are a force for good rather than a source of division and discrimination depends partly on researchers such as Singh and Weller. The stakes are high, but so are the opportunities. Universities have a vital role to play, both as critic and conscience of society. Academics can help society imagine what lies ahead and decide what we want from machine learning – and what it would be wise to guard against.</p>&#13; &#13; <p>Weller believes the future of work is a huge issue: “Many jobs will be substantially altered if not replaced by machines in coming decades. We need to think about how to deal with these big changes.” And academics must keep talking as well as thinking. “We’re grappling with pressing and important issues,” he concludes. “As technical experts we need to engage with society and talk about what we’re doing so that policy makers can try to work towards policy that’s technically and legally sensible.”</p>&#13; &#13; <div><em>Inset image: read more about our AI research in the University's research magazine; download a <a href="/system/files/issue_35_research_horizons_new.pdf">pdf</a>; view on <a href="https://issuu.com/uni_cambridge/docs/issue_35_research_horizons">Issuu</a>.</em></div>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Fairness, trust and transparency are qualities we usually associate with organisations or individuals. Today, these attributes might also apply to algorithms. As machine learning systems become more complex and pervasive, Cambridge researchers believe it’s time for new thinking about new technology.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">With penalties including fines of up to €20 million, people are realising that they need to take data protection much more seriously</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Jat Singh</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://unsplash.com/photos/grayscale-photo-of-person-running-in-panel-paintings-5GwLlb-_UYk" target="_blank">Daniel Werbrouck</a></div></div></div><div class="field field-name-field-panel-title field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Want to hear more? </div></div></div><div class="field field-name-field-panel-body field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p>Join us at the Cambridge Science Festival to hear Adrian Weller discuss how we can ensure AI systems are transparent, reliable and trustworthy. </p>&#13; &#13; <p>Thursday 15 March 2018, 7:30pm - 8:30pm</p>&#13; &#13; <p>Mill Lane Lecture Rooms, 8 Mill Lane, Cambridge, UK, CB2 1RW</p>&#13; &#13; <p><a href="https://www.festival.cam.ac.uk/events/trust-and-transparency-ai-systems">BOOK HERE</a></p>&#13; </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width: 0px;" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 23 Feb 2018 09:30:00 +0000 lw355 195572 at Protecting our data and identity: how should the law respond?
/research/features/protecting-our-data-and-identity-how-should-the-law-respond <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/features/161026what-are-you-looking-atnolifebeforecoffee.jpg?itok=q1gEXxLF" alt="" title="Banksy stencil, Credit: nolifebeforecoffee" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The freedom of expression and the need for privacy may be strange bedfellows today – but could full-blown estrangement beckon in a digital future that makes the leap from user-controlled content to unfiltered, online sharing of, well, everything?</p> <p>A future where streaming your life online becomes the norm is not unthinkable, according to Dr David Erdos, whose research in the Faculty of Law explores the nature of data protection. “Take something like Snapchat Spectacles or Google Glass,” he says. “Such technology could very quickly take off, and all of a sudden it becomes ‘normal’ that everyone is recording everything, both audibly and visually, and the data is going everywhere and being used for all sorts of purposes – some individual, some organisational.”</p> <p>This makes questions about what control we have over our digital footprint rather urgent.</p> <p>“You can see that we need to get some grip on how the right to privacy can be enforced as technologies continue to develop that can pose serious threats to individuals’ sense of dignity, reputation, privacy and safety,” he adds.</p> <p>One example of enforcement Erdos refers to is <em>Google Spain</em>, a ruling made in 2014 by the Court of Justice of the European Union (CJEU) that examined search engines’ responsibilities when sharing content about us on the world wide web.</p> <p>The CJEU ruled that people across all of the 28 EU Member States have a ‘right to be forgotten’ online, giving them an ability to prohibit search engines indexing inadequate, irrelevant or otherwise illegal information about them against their name. This right to be forgotten is based on Europe’s data protection laws and applies to all online information about a living person.</p> <p>Google responded by publishing a form you can submit to have such links to content (not the actual content) removed. I put it to the test – Google refuses on the basis that web links to my long-closed business are “justified” as they “may be of interest to potential or current consumers”.</p> <p>Erdos explains that data protection doesn’t always work as it was originally intended to. “On paper, the law is in favour of privacy and the protection of individuals – there are stringent rules around data export, data transparency and sensitive data, for example.</p> <p>“But that law was in essence developed in the 1970s, when there were few computers. Now we have billions of computers, and the ease of connectivity of smartphones and the internet. Also, sharing online is not practically constrained by EU boundaries.</p> <p>“That means the framework is profoundly challenged. There needs to be a more contextual legal approach, where the duties and possibly also the scope take into account risk as well as the other rights and interests that are engaged. That law must then be effectively enforced.”</p> <p>In fact, the EU data protection law currently extends surprisingly far. “By default, the law regulates anyone who alone, or jointly with others, does anything with computerised information that mentions a living person,” Erdos explains. “That could include many individuals on social networking sites. If you’re disseminating information about a third party to an indeterminate number of people, you’re (in theory at least) responsible for adherence to this law.”</p> <p>Tweeters, for instance, may have to respond to requests for data (Tweets) to be rectified for inaccuracy or even removed entirely, and field ‘subject access requests’ for full lists of everything they’ve Tweeted about someone. And under the new General Data Protection Regulation that comes into effect in 2018, the maximum penalty for an infringement is €20 million (or, in the case of companies, up to 4% of annual global turnover).</p> <p>When it comes to search engines or social media, Erdos admits that a strict application of the law is “not very realistic”. He adds: “There’s a systemic problem in the gap between the law on the books and the law in reality, and the restrictions are not desperately enforced.”</p> <p>Erdos believes inconsistencies in the law could be exploited online by the ruthless. “The very danger of all-encompassing, stringent laws is that it seems as if responsible organisations and individuals who take them seriously are hamstrung while the irresponsible do whatever they want.”</p> <p>This also applies to ‘derogations’ – areas where the law instructs a balance must be struck between data protection and the rights to freedom of journalistic, literary and artistic expression.</p> <p>“Member states have done radically different things in their formal law here – from nothing at all through to providing a blanket exception – neither of which was the intention of the EU scheme.”</p> <p>As the new law in 2018 will empower regulators to hand out fines of up to hundreds of millions of euros to large multinational companies, Erdos is passionate about the urgency of Europe getting a coordinated and clear approach on how its citizens can exercise their data protection rights.</p> <p>“We are giving these regulators quite enormous powers to enforce these rules and yet do we have a good understanding of what we want the outcome to be and what we’re expecting individuals and organisations to do?” Erdos ponders.</p> <p>“To me, this means that the enforcement will become more and more important. Data protection is not just a technical phrase – people really do need protection. The substance of the law needs to be hauled into something that’s more reasonable. That protection needs to be made real.”</p> <p>Erdos’ research also explores the nature of data protection and academic freedom, and he successfully argued for academic expression to be added to the list of free speech derogations in the 2018 legislation. “I have come across the most egregious examples of research guidance stipulating alleged data protection requirements, including claims that published research can’t include any identifiable personal data at all,” says Erdos.</p> <p>“In a survey of EU data protection authorities, I asked whether a journalist’s undercover investigation into extremist political beliefs and tactics and an academic’s undercover research into widespread claims of police racism could be legal under data protection. Not one regulator said the activity of the journalist would in principle be illegal, but almost half said the academic’s activity would be unlawful.</p> <p>“Academics aim to write something of public importance, and make it rigorous. The old law was seen to prioritise even tittle-tattle in a newspaper over academic research; one hopes this will largely be removed by the new law.”</p> <p>For many, the greatest concern remains the potential threats to their privacy. In order for consumers to feel safe with emerging technology, lawmakers may have to legislate for potential breaches now, rather than react after the damage is done.</p> <p>“We don’t want to respond in a panic of registering or documenting everything, but the alternative of collapse into an ‘anything goes’ situation is equally dangerous.</p> <p>“Apps like Snapchat show many people value being able to upload certain pictures and information that soon disappear. We don’t want people forgetting what they’re sharing today, and then worrying far too late how third parties are using that information.”</p> <p>Would Erdos himself ever use Snapchat Spectacles or Google Glass (he does own a smartphone)? He laughs. “Let’s face it, email, the internet, Google search… people ended up having to use them. So, never say never!”</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Many of us see our privacy as a basic right. But in the digital world of app-addiction, geolocation tracking and social oversharing, some may have cause to wonder if that right is steadily and sometimes willingly being eroded away.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">You can see that we need to get some grip on how the right to privacy can be enforced as technologies continue to develop that can pose serious threats to individuals’ sense of dignity, reputation, privacy and safety</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">David Erdos</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/nolifebeforecoffee/124659356/in/photolist-c1UTf-61hZ1e-6QwoaX" target="_blank">nolifebeforecoffee</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Banksy stencil</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>.
For image use please see separate credits above.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution-sharealike">Attribution-ShareAlike</a></div></div></div> Fri, 28 Oct 2016 13:35:03 +0000 ts657 180522 at Talkin' 'bout a revolution: how to make the digital world work for us /research/discussion/talkin-bout-a-revolution-how-to-make-the-digital-world-work-for-us <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/discussion/160921world-connectivityeric-fischerarticle.jpg?itok=sbYwOAXc" alt="World travel and communications recorded on Twitter" title="World travel and communications recorded on Twitter, Credit: Eric Fischer" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>New information and communication technologies are having a profound impact on many aspects of social, political and economic life, raising important new issues of social and public policy. Surveillance, privacy, data protection, advanced robotics and artificial intelligence are only a few of the many fundamental issues that are now forcing themselves onto the public agenda in many countries around the world.</p> <p>There have been other great technological revolutions in the past but the digital revolution is unprecedented in its speed, scope and pervasiveness. Today, less than a decade after smartphones were first introduced, around half the adult population in the world owns one – and by 2020, according to some estimates, 80% will.</p> <p>Smartphones are, of course, much more than phones: they are powerful computers that we carry around in our pockets and handbags and that give us permanent mobile connectivity. While they enable us to do an enormous range of things, from checking and sending emails to ordering a taxi, using a map and paying for a purchase, they also know a lot about us – who we are, where we are, which websites we visit, what transactions we’ve made, whom we’re communicating with, and so on. They are great enablers but also powerful generators of data about us, some of which may be very personal. Do we know who has access to this data? Do we know what they do with it? Do we care?</p> <p>The rapid rise and global spread of the smartphone is just one manifestation of a technological revolution that is a defining feature of our time. No one in the world today is beyond its reach: the everyday act of making a phone call or using a credit card immediately inserts you into complex global networks of digital communication and information flow.</p> <p>In fact, the digital revolution is often misunderstood because it is equated with the internet and yet is much more than this. It involves several interconnected developments: the pervasive digital codification of information; the dramatic expansion of computing power; the integration of information technologies with communication systems; and digital automation or robotics.</p> <p>Taken together, these developments are spurring profound changes in all spheres of life, from industry and finance to politics, from the nature of public debate to the character of personal relationships, disrupting established institutions and practices, opening up new opportunities and creating new risks.</p> <p>In Cambridge, an ambitious new interdisciplinary collaboration around ‘digital society’ is being forged to bring together social scientists and computer scientists to tackle some of the big questions raised by the digital revolution.</p> <p>The key idea underlying the collaboration is that some of the most important intellectual challenges in this emerging area require <em>both </em>a firm grasp of technology <em>and </em>a deep understanding of processes that are fundamentally social and political in character.</p> <p>Cambridge is uniquely well-placed to tackle these challenges. As a world-leading university in computer science and technology, the University has been at the forefront of some of the most important developments in this field. Cambridge is also a leading research and development centre for the IT industry. Several significant technology companies are based here, including Microsoft Research, ARM and a sizeable number of smaller companies and start-ups. There is also a large group of scholars and researchers in Cambridge in the social sciences and law who are working on aspects of the digital revolution.</p> <p>By bringing together social scientists and computer scientists on specific research projects, we are forging a new form of interdisciplinary collaboration that will enable us to grapple with some of the big challenges posed by the digital revolution (see panel).</p> <p>These endeavours dovetail well with research initiatives that are already under way in Cambridge, including the Leverhulme Centre for the Future of Intelligence, the Cambridge Cybercrime Centre and the University’s Strategic Research Initiatives and Networks on Big Data, Public Policy, Public Health and Digital Humanities. Cambridge is also a key partner in the UK’s national centre for data science, the Alan Turing Institute, and in the Horizon Digital Economy programme, which aims to tackle the challenge of harnessing the power of ubiquitous computing in a way that is acceptable to society.</p> <p>While the collaborative work carried out in Cambridge is primarily research-oriented, it is also likely to have significant practical implications. Cambridge has a strong track record in producing world-leading research that feeds directly into real-world applications. As examples, software systems Docker and the Xen hypervisor developed in the Computer Laboratory now run much of the public cloud computing infrastructure, and Raspberry Pi is widely used in technology education in schools.</p> <p>We are living through a time of enormous social, political and technological change. On the one hand, the digital revolution is enabling massive new powers to be exercised by states and corporations in ways that were largely unforeseen. And, on the other, it is giving rise to new forms of mobilisation and disruption from below by a variety of actors who have found new ways to organise and express themselves in an increasingly networked world. While these and other developments are occurring, the traditional institutions of democratic governance find themselves ill-equipped to understand and keep pace with the new social and technological landscapes that are rapidly emerging around them.</p> <p>There is no better moment, in our view, to bring together social scientists and computer scientists to tackle the big questions raised by one of the most profound and far-reaching revolutions of our time.</p> <p><em>Jon Crowcroft is the Marconi Professor of Communications Systems at Cambridge’s Computer Laboratory and Professor John Thompson is at the Department of Sociology.</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The digital revolution is one of the great social transformations of our time. How can we make the most of it, and also minimise and manage its risks? Jon Crowcroft and John Thompson discuss the challenges as we commence a month-long focus on ‘digital society’.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">There is no better moment to bring together social scientists and computer scientists to tackle the big questions raised by one of the most profound and far-reaching revolutions of our time</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Jon Crowcroft and John Thompson</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/walkingsf/6635655755/in/photolist-b7ntgR-qrtG5H-a3RMm8-8QDuqn-4HfejV-c1iJvo-aHYvxz-pNda2g-fy75Mn-csC4fm-quUxLc-qsV31F-bXnQGG-4oRM7D-nAyZ1V-dRHqgM-aDBcpW-g8WA4s-q7PUmV-r4sXCp-bFSQUk-oFBNRz-qdwQn6-4mHFXo-qiT3fH-oYVLPe-772ZsC-djTuqy-qiT1dX-qdxmaJ-dPcqJ1-mHXecR-br1nSQ-qiT1Fk-7tDRuu-4oRUnZ-djYoW3-GDbhK6-q1Ergk-nXpRT6-6Zg2UR-qN7RaF-e8sgnP-bXnZgU-4Xtgen-sym1iC-nF3q89-rVieQH-8qaozb-pHL7j" target="_blank">Eric Fischer</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">World travel and communications recorded on Twitter</div></div></div><div class="field field-name-field-panel-title field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Key challenges for digital society</div></div></div><div class="field field-name-field-panel-body field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><ul> <li>What are the consequences of permanent connectivity for the ways that individuals organise their day-to-day lives, interact with others, form social relationships and maintain them over time? </li> <li>What implications do these transformations have for traditional forms of political organisation and communication? Are they fuelling alternative forms of social and political mobilisation, facilitating grass-roots movements and eroding trust in established political institutions and leaders?</li> <li>What are the implications for privacy of the increasing capacity for surveillance afforded by global networks of communication and information flow? Do individuals in different parts of the world value privacy in the same way, or is this a distinctively Western preoccupation?</li> <li>How is censorship exercised on the internet? What forms does it assume and what kinds of material are censored? How do censorship practices vary from one country to another? To what extent are individuals aware of censorship and how do they cope with it?</li> <li>Just as the internet creates new opportunities for states and other organisations to exercise surveillance and censorship, so too it enables individuals and other organisations to disclose information that was previously hidden from view and to hold governments and corporations to account: who are the digital whistleblowers, how effective are they and what are the consequences of the new forms of transparency and accountability that they, among others, are developing?</li> <li>What techniques do criminals use to deceive users online, how widespread are their activities and what can users do to avoid getting caught in their traps?</li> <li>What impact is the digital revolution – including developments in artificial intelligence and machine learning – having on traditional industries and forms of employment, and what impact is it likely to have in the coming years? Will it usher in a new era of mass unemployment in which professional occupations as well as manual jobs are displaced by automation, as some fear? </li> <li>What are the implications of the pervasive digitisation of intellectual content for our traditional ways of thinking about intellectual property and our traditional legal mechanisms for regulating intellectual property rights?</li> <li>How widespread are new forms of currency that exist only online – so-called cryptocurrencies like Bitcoin – and what impact are they likely to have on traditional financial practices and institutions?</li> <li>How are new forms of data analysis and advanced robotics affecting the practice of medicine, the provision of healthcare and the detection and control of disease, and how might they affect them in the future?</li> </ul> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>.
For image use please see separate credits above.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution-noncommercial-sharealike">Attribution-Noncommercial-ShareAlike</a></div></div></div> Mon, 03 Oct 2016 13:45:18 +0000 Anonymous 179012 at Opinion: How a comic character sparked our very modern privacy fears – 200 years ago /research/news/opinion-how-a-comic-character-sparked-our-very-modern-privacy-fears-200-years-ago <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/discussion/160224paulpry.jpg?itok=4YNGwiAx" alt="English actor John Liston as the title character in John Poole&#039;s 1825 farce, &quot;Paul Pry&quot;" title="English actor John Liston as the title character in John Poole&#039;s 1825 farce, &quot;Paul Pry&quot;, Credit: Folger Shakespeare Library, Washington, DC (Julie Ainsworth, photographer)" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><blockquote>&#13; <p>We live in a time where there is no longer any privacy. Everything is recorded and shared, permanently available to those who pry or, as they may think of it, research.</p>&#13; </blockquote>&#13; &#13; <p>While AS Byatt <a href="https://www.theguardian.com/books/2016/feb/12/i-am-no-one-patrick-flanery-review">wrote</a> this just recently, the debate about privacy is not a new phenomenon. Back in 1825, the theatrical sensation of the year was a comedy entitled <a href="http://www.ephemera-society.org.uk/items/2012/jan12.html">Paul Pry</a>, which played to packed houses in London, throughout the provinces, and by the following year as far away as New York. The drama became a matter of public debate through its central character’s catchphrase, “I hope I don’t intrude”, which appeared not just in the rhetoric of politicians and commentators, but was also printed onto various objects in an early example of merchandising.</p>&#13; &#13; <p>In the play, the eponymous hero was constantly prying into the domestic affairs of his neighbours, either by eavesdropping or by intercepting their letters. At every turn he misunderstood the secrets he had acquired, generating an escalating confusion in relations between lovers and between parents and children.</p>&#13; &#13; <p>The phrase expressed the increasing ambivalence of popular attitudes towards privacy. It conveyed the new enthusiasm for inquiry, the sense that advances in education and the expansion of the media were creating a new era of transparency and informed debate. The state began to sponsor elementary schools in 1833 and three years later lifted what was described as the “tax on knowledge” – a stamp imposed on newspapers to put them out of reach of working-class readers. In 1840 it sought to widen access to communication by introducing the <a href="http://www.postalheritage.org.uk/explore/history/pennyblack/">flat-rate, pre-paid Penny Black stamp</a>, which cost a penny irrespective of distance.</p>&#13; &#13; <p>At the same time, there was an anxiety that the realm of private communication was under threat from new forms of surveillance. As the family was increasingly thrust to the foreground as the core of morality and discipline, it seemed that its capacity to keep its own secrets was coming under attack.</p>&#13; &#13; <p>Matters came to a head in 1844 when the recently launched satirical magazine, <a href="https://www.punch.co.uk/">Punch</a>, published a cartoon depicting the Home Secretary, Sir James Graham, dressed as Paul Pry, standing in the new Post Office Headquarters, eagerly opening the mail that was passing through the system in ever-increasing volumes. The government had been accused of intercepting the correspondence of suspected Italian republicans at the behest of the Austrian government. Coming just four years after the costly decision to democratise the post, it stood charged with <a href="http://historyandpolicy.org/policy-papers/papers/surveillance-privacy-and-history">exploiting new opportunities for invading the private realm</a>. The Lord Chief Justice depicted the Home Secretary “opening a private letter, becoming the depository of the secrets of a private family … meeting an individual in society and knowing that he was in possession of secrets dearer to him than his life.”</p>&#13; &#13; <p>The Home Secretary’s career never recovered from the controversy. The event, wrote his biographer, was:</p>&#13; &#13; <blockquote>&#13; <p>like a match struck for a moment amid profound darkness, revealing to the startled crowd vague forms of terror, of which they had never previously had a glimpse … and about which they forthwith began to talk at random, until a gigantic system of espionage had been conjured up which no mere general assurance of its unreality could dispel.</p>&#13; </blockquote>&#13; &#13; <p>The affair marked the beginning of the inclination to cast the privacy debate in magnified terms: from a handful of Italian exiles to all those who sent a letter. The postal network exposed the privacy of every citizen, and surveillance embraced not just their political views, but every aspect of their domestic lives committed to paper.</p>&#13; &#13; <p>Adding a very modern twist was the government’s refusal to confirm or deny the charge made against it, inventing the doctrine – which has been followed in the centuries up to Edward Snowden’s revelations – of refusing to comment on its use of surveillance technologies. This had the advantage of keeping the press and readers at bay, leaving successive governments free to extend their operations in response to unforeseeable threats to the security of the state. On the other hand, it prevented the government from denying what it had not done, leaving the field open to conspiracy theorists.</p>&#13; &#13; <p>Today, despite belated attempts to enshrine the powers of the security services in law through the <a href="https://theconversation.com/investigatory-powers-bill-will-remove-isps-right-to-protect-your-privacy-50178">Investigatory Powers Bill</a>, the structures of surveillance remain wilfully opaque. <a href="https://terrorismlegislationreviewer.independent.gov.uk/about-me/">David Anderson QC</a>, reviewer of the government’s counter-terrorism legislation, <a href="https://www.theguardian.com/world/2015/jun/11/uk-intelligence-agencies-should-keep-mass-surveillance-powers-report-gchq">wrote</a> that the Regulation of Investigatory Powers Act is “obscure since its inception, has been patched up so many times as to make it incomprehensible to all but a tiny band of initiates”. The current draft of the Investigatory Powers Bill claims to initiate a new regime of clarity, yet ensures that the grounds for steaming open electronic letters are written in terms that allow the broadest interpretation.</p>&#13; &#13; <p>For 19th-century families, as for those of today, maintaining control over personal communication was a matter of constant adjustment and compromise, of small victories and passing defeats. But today, privacy has become a controversy played out in public – fuelled by revolutions in our means of communication, and conditioned by governments forever inclined to keep secrecy secret.</p>&#13; &#13; <p><em><strong><span><a href="https://theconversation.com/profiles/david-vincent-230840">David Vincent</a>, Visiting Fellow in Technology and Democracy, CRASSH, <a href="https://theconversation.com/institutions/university-of-cambridge-1283">University of Cambridge</a></span></strong></em></p>&#13; &#13; <p><em><strong>This article was originally published on <a href="https://theconversation.com/">The Conversation</a>. Read the <a href="https://theconversation.com/how-a-comic-character-sparked-our-very-modern-privacy-fears-200-years-ago-55076">original article</a>.</strong></em></p>&#13; &#13; <p><em>The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>David Vincent (CRASSH) discusses the nineteenth-century theatrical sensation that inspired public debate about privacy.</p>&#13; </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://commons.wikimedia.org/wiki/File:John_Liston_as_Paul_Pry_circa_1825.jpg" target="_blank">Folger Shakespeare Library, Washington, DC (Julie Ainsworth, photographer)</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">English actor John Liston as the title character in John Poole&#039;s 1825 farce, &quot;Paul Pry&quot;</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>.
For image use please see separate credits above.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution-sharealike">Attribution-ShareAlike</a></div></div></div> Thu, 25 Feb 2016 14:17:46 +0000 Anonymous 168232 at