University of Cambridge - Julia Powles /taxonomy/people/julia-powles en DeepMind-Royal Free deal is “cautionary tale” for healthcare in the algorithmic age /research/news/deepmind-royal-free-deal-is-cautionary-tale-for-healthcare-in-the-algorithmic-age <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/cropforweb_5.jpg?itok=0F7_fjVc" alt="DeepMind acquired the data for software that could send clinicians alerts about patients at risk of Acute Kidney Injury, but the agreement also gave it access to a substantial number of records about unaffected patients. " title="DeepMind acquired the data for software that could send clinicians alerts about patients at risk of Acute Kidney Injury, but the agreement also gave it access to a substantial number of records about unaffected patients. , Credit: NEC Corporation of America" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Researchers studying a deal in which Google’s artificial intelligence subsidiary, DeepMind, acquired access to millions of sensitive NHS patient records have warned that more must be done to regulate data transfers from public bodies to private firms.</p>&#13; &#13; <p><a href="https://link.springer.com/article/10.1007/s12553-017-0179-1">The academic study</a> says that “inexcusable” mistakes were made when, in 2015, the Royal Free NHS Foundation Trust in London signed an agreement with Google DeepMind. 
This allowed the British AI firm to analyse sensitive information about 1.6 million patients who use the Trust’s hospitals each year.</p>&#13; &#13; <p>The access was used for Streams, monitoring software for mobile devices that promises to improve clinicians’ ability to support patients with Acute Kidney Injury (AKI). But according to the study’s authors, the purposes stated in the agreement were far less specific, and made more open-ended references to using data to improve services.</p>&#13; &#13; <p>More than seven months after the deal was put in place, an investigation by <a href="https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/">New Scientist</a> revealed that DeepMind had gained access to a huge number of identifiable patient records, and that it was not possible for the public to track how these were being used. They included information about people who were HIV-positive, details about drug overdoses and abortions, and records of routine hospital visits.</p>&#13; &#13; <p>In November 2016, DeepMind and the Trust replaced the old agreement with a new one. The original deal is being investigated by the Information Commissioner’s Office (ICO), which has yet to report any findings publicly. The National Data Guardian (NDG) is also continuing to look into the arrangement. DeepMind retained access to the data that it had been given even after the ICO and NDG became involved, and the app is being deployed.</p>&#13; &#13; <p>Both the Trust and DeepMind have disputed the findings in the new study. In a joint response, they claimed that the paper misrepresented the NHS’s use of technology to process data, and that the analysis contained several errors. 
They stressed that the Streams app was making a significant difference to hospital staff, and highlighted its life-saving potential.</p>&#13; &#13; <p>The authors, however, said that these accusations of factual inaccuracy and analytical error were unsubstantiated, and that <a href="https://link.springer.com/article/10.1007/s12553-017-0179-1">their article</a> makes clear why the agreement is unusual and in the public interest.</p>&#13; &#13; <p>The study reviews the original agreement in depth, with a systematic synthesis of publicly available documentation, statements, and other details obtained through Freedom of Information requests. It was carried out by Dr Julia Powles, a Research Associate in law and computer science at St John’s College, University of Cambridge, and Hal Hodson, who broke the New Scientist story and is now Technology Correspondent for The Economist.</p>&#13; &#13; <p>Both authors say that it is unlikely that DeepMind’s access ever represented a data security risk, but that the terms were nonetheless highly questionable, in particular because they lacked transparency and rested on an inadequate legal and ethical basis for Trust-wide data access.</p>&#13; &#13; <p>They say the case should be a “cautionary tale” for the NHS and other public institutions, which are increasingly seeking tech companies’ help to improve services, but could, in the process, surrender substantial amounts of sensitive information, creating “significant power asymmetries between citizens and corporations”.</p>&#13; &#13; <p>“Data analytics and machine learning in general offer promise in improving healthcare and clearly digital technology companies will have a role to play,” Powles said. 
“To the extent that it signals a move in this direction, we think that there were inadequacies in the case of this particular deal.”</p>&#13; &#13; <p>“The deal betrays a level of naivety regarding how public sector organisations set up data-sharing arrangements with private firms, and it demonstrates a major challenge for the public and public institutions. It is worth noting, for example, that in this case DeepMind, a machine learning company, had to make the bizarre promise that it would not yet use machine learning, in order to engender trust.”</p>&#13; &#13; <p>Powles and Hodson argue that the transfer of data to DeepMind did not proceed as it should have, questioning in particular its invocation of a principle known as “direct care”. This assumes that an “identified individual” has given implied consent for their information to be shared for uses that involve the prevention, investigation, or treatment of illness.</p>&#13; &#13; <p>No patient whose data was shared with DeepMind was ever asked for their consent. Although direct care would clearly apply to those monitored for AKI, the records that DeepMind received covered every other patient who used the Trust’s hospitals. These extended to people who had never been tested or treated for kidney injuries, people who had left the catchment area, and even some who had died.</p>&#13; &#13; <p>In fact, the authors note that, according to the Royal Free and DeepMind’s own announcements, only one in six of the records DeepMind accessed would have involved AKI patients. For a substantial number of patients, therefore, the relationship was indirect. As a result, special permissions should have been sought from the Government, and agencies such as the ICO and NDG should have been consulted. This did not happen.</p>&#13; &#13; <p>Such applications of “direct care” have been queried before. 
In December 2016, Dr Alan Hassey of the NDG, which provides national guidance on the use of confidential information, <a href="https://www.gov.uk/government/speeches/reasonable-expectations">wrote</a> that “an erroneous belief has taken hold in some parts of the health and care system that if you believe what you are doing is direct care, you can automatically share information on a basis of implied consent”. Dr Hassey noted that direct care is not “of itself a catch-all… The crucial thing is that information sharing must be in line with the reasonable expectations of the individual concerned”.</p>&#13; &#13; <p>The researchers’ survey also criticises the lack of transparency in the agreement, pointing out that neither party made clear the volume of data involved, nor that it involved so many identifiable records. How that data has been, and is being, used has never been independently scrutinised. Last week, DeepMind announced plans to develop, at an unspecified future stage, a new data-tracking system to make such processes more transparent.</p>&#13; &#13; <p>The authors liken the relationship overall to a one-way mirror. “Once our data makes its way onto Google-controlled servers, our ability to track it – to understand how and why decisions are made about us – is at an end,” they write.</p>&#13; &#13; <p>The paper says that an obvious lesson is that no such deal should be launched without full disclosure of the framework of documents and approvals which underpins it. In light of the 2013 Caldicott review of the sharing of patient information, they write: “The failure of both sides to engage in any conversation with patients and citizens is inexcusable.”</p>&#13; &#13; <p>They also suggest that private companies should have to account for their use of public data to properly-resourced and independent bodies. 
Without this, they argue, tech companies could gradually gain an unregulated monopoly over health analytics.</p>&#13; &#13; <p>“The reality is that the exact nature and extent of Google’s interests in NHS patient data remain ambiguous,” the authors add. Powles notes that while Google has no stated plans to exploit the data for advertising and other commercial uses, its unparalleled access to such information, without any meaningful oversight, means that the possibility cannot be ruled out in future.</p>&#13; &#13; <p>“I personally think that because data like this can get out there, we are almost becoming resigned to the idea,” Powles added. “This case stresses that we shouldn’t be. Before public institutions give away longitudinal data sets of our most sensitive details, they should have to account to a comprehensive, forward-thinking and creative regulatory system.”</p>&#13; &#13; <p>A spokesman for both the Royal Free London and DeepMind said that both organisations were “committed to working together to support world class care for patients”. He added: “Every trust in the country uses IT systems to help clinicians access current and historic information about patients under the same legal and regulatory regime.”</p>&#13; &#13; <p>Powles and Hodson are working on a second paper, analysing the terms of the revised DeepMind-Royal Free arrangement since November 2016 and the ongoing regulatory investigations. Their current study is published in the journal Health and Technology. 
</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A study of a deal which has allowed Google DeepMind access to millions of healthcare records argues that more needs to be done to regulate such agreements between public sector bodies and private technology firms.</p>&#13; </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">The deal betrays a level of naivety regarding how public sector organisations set up data-sharing arrangements with private firms, and it demonstrates a major challenge for the public and public institutions</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Julia Powles</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/neccorp/14445634744/in/photolist-o1vDW1-nJ7P9M-o3oLa4-apHgyH-9oq9gd-dAbMJi-aptNKm-7oad5w-apr6M4-6FHkXS-6FHo9w-aptNNS-bNBdin-9ZZRS1-76RHXF-apKYMA-8mxzUT-5z9pa5-6ugLNZ-8ruvhJ-bNCY7V-6iBcpH-6sAzTr-5ZBWqh-5yZvrM-b6p5F2-nJ8QAF-bq7tvW-8cKTd3-8VyCoj-7RuUbq-Kzbc9-86r8dv-6Fpoeb-b61x2Z-rpZqdH-FWseph-8k36sE-8k36ub-GmyAok-4LuYhd-eixtvY-JBGQC-eZriSJ-92kY7W-7unjAj-7uiryP-ej5sQW-5jp94p-5xyteY" target="_blank">NEC Corporation of America</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">DeepMind acquired the data for software that could send clinicians alerts about patients at risk of Acute Kidney Injury, but the agreement also gave it access to a substantial number of records about unaffected patients. 
</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Thu, 16 Mar 2017 12:30:01 +0000 Combating cybercrime when there’s plenty of phish in the sea /research/features/combating-cybercrime-when-theres-plenty-of-phish-in-the-sea <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/features/161020teqis-graffitti-phishlasthuckleberry.jpg?itok=sC6xqJpZ" alt="" title="TeQi&#039;s Graffitti Phish, Credit: LastHuckleBerry" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>We’ve all received the emails, hundreds, maybe thousands of them. 
Warnings that our bank account will be closed tomorrow, and we’ve only got to click a link and send credit card information to stop it from happening. Promises of untold riches, and it will only cost a tiny fee to access them. Stories of people in desperate circumstances, who only need some kind soul to go to the nearest Western Union and send a money transfer to save them.</p> <p>Tricking people into handing over sensitive information such as credit card details – known as ‘phishing’ – is one of the ways criminals scam people online. Most of us think we’re smarter than these scams. Most of us think that we could probably con the con artist if we tried. But we would be wrong.</p> <p>Across the world, cybercrime is booming. When the UK government included cybercrime in the national crime statistics for the first time in 2015, it doubled the crime rate overnight. Millions of people worldwide are victimised by online scams, whether it’s blocking access to a website, stealing personal or credit card information, or attempting to extort money by remotely holding the contents of a personal computer hostage.</p> <p>“Since 2005, the police have largely ignored cybercrime,” says Professor Ross Anderson of Cambridge’s Computer Laboratory. “Reported crime fell by as much as a half in some categories. Yet, now that online and electronic fraud are included, the number of reported crimes has more than doubled. Crime was not falling; it was just moving online.”</p> <p>In 2015, computer scientists, criminologists and legal academics joined forces to form the <a href="https://www.cambridgecybercrime.uk/">Cambridge Cybercrime Centre</a>, with funding from the Engineering and Physical Sciences Research Council. 
Their aim is to help governments, businesses and ordinary users to construct better defences.</p> <p>To understand how the criminals operate, researchers use machine learning and other techniques to recognise bad websites, understand what kinds of brands tend to be attacked and how often, determine how many criminals are behind an attack by looking at the pattern of the creation of fake sites, and assess how effective the various defence systems are at getting them taken down.</p> <p>One way in which studying cybercrime differs from many other areas of research is that the datasets are difficult to come by. Most belong to private companies, and researchers need to work hard to negotiate access. This is generally done through nondisclosure agreements, even if the data is out of date. And once researchers complete their work, they cannot make the data public, since it would reduce the competitive advantage of corporate players, and it may also make it possible for criminals to reverse engineer what was detected (and what wasn’t) and stay one step ahead of law enforcement.</p> <p>One of the goals of the Cambridge Cybercrime Centre is to make it easier for cybercrime researchers from around the world to get access to data and share their results with colleagues.</p> <p>To open up cybercrime research to colleagues across the globe, the team will leverage their existing relationships to collect and store cybercrime datasets, and then any bona fide researcher can sign a licence with the Centre and get to work without all the complexity of identifying and approaching the data holders themselves.</p> <p>“Right now, getting access to data in this area is incredibly complicated,” says Dr Richard Clayton of Cambridge’s Computer Laboratory, who is also Director of the Centre. “But we think the framework we’ve set up will create a step change in the amount of work in cybercrime that uses real data. 
More people will be able to do research, and by allowing others to work on the same datasets more people will be able to do reproducible research and compare techniques, which is done extremely rarely at the moment.”</p> <p>One of the team helping to make this work is Dr Julia Powles, a legal researcher cross-appointed between the Computer Laboratory and Faculty of Law. “There are several hurdles to data sharing,” says Powles. “Part of my job is to identify which ones are legitimate – for example, when there are genuine data protection and privacy concerns, or risks to commercial interests – and to work out when we are just dealing with paper tigers. We are striving to be as clear, principled and creative as possible in ratcheting up research in this essential field.”</p> <p>Better research will make for better defences for governments, businesses and ordinary users. Today, there are a lot more tools to help users defend themselves against cybercrime – browsers are getting better at recognising bad URLs, for example – but, at the same time, criminals are becoming ever more effective, and more and more people are getting caught in their traps.</p> <p>“You don’t actually have to be as clever as people once thought in order to fool a user,” says Clayton when explaining how fake bank websites are used to ‘phish’ for user credentials. “It used to be that cybercriminals would register a near-identical domain name – Barclays with two Ls, for instance. But they generally don’t do that for phishing attacks anymore, as end users aren’t looking at the address bar, they’re looking at whether the page looks right, whether the logos look right.”</p> <p>The Centre is also looking at issues around what motivates someone to commit cybercrime, and what makes them stop.</p> <p>According to Dr Alice Hutchings, a criminologist specialising in cybercrime, cybercriminals tend to fall into two main categories. 
The first category is the opportunistic offender, who may be motivated by a major strain in their lives, such as financial pressures or problems with gambling or addiction, and who uses cybercrime as a way to meet their goals. The second type of offender typically comes from a more stable background, and is gradually exposed to techniques for committing cybercrime through associations with others.</p> <p>Both groups will usually keep offending as long as cybercrime meets their particular needs, whether it’s financial gratification, supporting a drug habit, or recognition within their community. What often makes offenders stop is the point at which the costs of continuing outweigh the benefits: for instance, when it takes a toll on their employment, other outside interests or personal relationships.</p> <p>“Most offenders never get caught, so there’s no reason to think that they won’t go back to cybercrime,” says Hutchings. “They can always start again if circumstances in their lives change.</p> <p>“There is so much cybercrime happening out there. You can educate potential victims, but there will always be other potential victims, and new ways that criminals can come up with to social engineer somebody’s details, for example. Proactive prevention against potential offenders is a good place to start.”</p> <p>Criminologist Professor Lawrence Sherman believes the collaboration between security engineering and criminology is long overdue, both at Cambridge and globally: “Cybercrime is the crime of this century, a challenge we are just beginning to understand and challenge with science.”</p> <p>“We’re extremely grateful to the people giving us this data, who are doing it because they think academic research will make a difference,” says Clayton. “Our key contribution is realising that there was a roadblock in terms of being able to distribute the data. 
It’s not that other people couldn’t get the data before, but it was very time-consuming, so only a limited number of people were doing research in this area – we want to change that.”</p> <p>“Our Cybercrime Centre will not only provide detailed technical information about what’s going on, so that firms can construct better defences,” says Anderson. “It will also provide strategic information, as a basis for making better policy.”</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>As more and more crime moves online, computer scientists, criminologists and legal academics have joined forces in Cambridge to improve our understanding and responses to cybercrime, helping governments, businesses and ordinary users construct better defences.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">You don’t actually have to be as clever as people once thought in order to fool a user</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Richard Clayton</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/zippidyserendipity/16423188579/in/photolist-r2g8MM-2Trrxt-2Tr8Rc-fot6Xg-9Q6RQu-2TvRhf-2Tr8Nr-a56GGq-9deUiG-JNHovd-JRJrcK-2TriKX-78okxd-2TvLa9-JqYBqh-HVpqjy-2TvPVu-HVkJRR-qZmSti-2TvRo3-JGNDnE-2Tvxr9-2TvLKw-JGJU15-2TvNXY-2Trj1B-2TriVk-JRXjF2-pL2PUE-GpB4w2-2Trpdz-a8D7vn-6vHa6F-2TvPnL-JNHnm9-6aPh2c-Jr8Sps-JNHmzQ-HVCauh-2TvAm1-2Trrii-2TvMkd-2TvMbG-2TvR79-2TrpPM-a54xrr-2TvRS9-2TvGLY-2TrcDB-2TroSz" target="_blank"> LastHuckleBerry</a></div></div></div><div class="field 
field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">TeQi&#039;s Graffitti Phish</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution-sharealike">Attribution-ShareAlike</a></div></div></div><div class="field field-name-field-related-links field-type-link-field field-label-above"><div class="field-label">Related Links:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="https://www.cambridgecybercrime.uk/">Cambridge Cybercrime Centre</a></div></div></div> Fri, 21 Oct 2016 07:51:23 +0000