University of Cambridge - Google
/taxonomy/external-affiliations/google

Cambridge and Google partner to facilitate AI research
/news/cambridge-and-google-partner-to-facilitate-ai-research

[Image: Research underway in the Centre for Human-Inspired Artificial Intelligence. Credit: Cambridge FilmWorks]

The new multi-year research agreement creates the potential for researchers and scientists from Google and the University to collaborate more closely on foundational AI research projects in areas of shared interest across a range of disciplines, including climate and sustainability, and AI ethics and safety.

Google has also become the first funding partner for the University's Centre for Human-Inspired Artificial Intelligence (CHIA, https://www.chia.cam.ac.uk/), led by Professor Anna Korhonen, Professor Per Ola Kristensson and Dr John Suckling, which brings together researchers and experts from computer science, engineering and multiple other disciplines to develop AI that is grounded in human values and benefits humanity. Google's unrestricted grant is helping to enable the Centre's AI research in areas such as responsible AI, human-centred robotics, human-machine interaction, healthcare, economic sustainability and climate change. The donation is also funding students from underrepresented groups to carry out PhDs within the CHIA, helping to broaden diversity in the AI research community.

The expanded partnership builds on years of collaboration between Google Research, Google DeepMind and the University of Cambridge. Google provides funding for academic research (http://research.google/programs-and-events/), facilitates collaboration between faculty and Google researchers, and supports exceptional computer science students through its PhD Fellowship Programme (http://research.google/programs-and-events/phd-fellowship/). Google DeepMind funds scholarships for students from underrepresented backgrounds studying AI-related fields (https://www.student-funding.cam.ac.uk/fund/deepmind-scholarship-2022), as well as a postdoctoral fellowship (https://www.cst.cam.ac.uk/news/dr-peter-ochieng-appointed-first-deepmind-academic-fellow-computer-science-cambridge), to help build a stronger and more inclusive AI community. Google DeepMind also endowed the first DeepMind Professor of Machine Learning (/research/news/cambridge-appoints-first-deepmind-professor-of-machine-learning) at the University of Cambridge's Department of Computer Science and Technology to help drive its machine learning and artificial intelligence research.

Matt Brittin, President of Google EMEA and University of Cambridge alumnus, commented: "AI has huge potential to benefit people across the world - whether it's through making daily life that bit easier, or by tackling some of society's biggest challenges. It's vital that we work together to seize this opportunity.
"By collaborating with one of our world-leading British academic institutions, we can enable AI research that is bold, responsible and designed to meet the needs of people across the country. This partnership also reaffirms Google's commitment to the UK as a global AI and technology leader."

Jessica Montgomery, Director of ai@cam (https://ai.cam.ac.uk/), the University of Cambridge's flagship mission on artificial intelligence, commented: "The University of Cambridge can be an engine for AI innovation and a steward of advancements in this exciting field. Translating advances in AI into benefits for science, citizens and society requires interdisciplinary research that is deeply connected to real-world needs. The research collaboration agreement announced today will support research activities across the University. We want to leverage the world-leading expertise found across the University to enable exciting new advances in responsible AI."

Michelle Donelan, Secretary of State for Science, Innovation and Technology, added: "Artificial intelligence can offer us enormous opportunities - growing the economy, creating new jobs and making lives longer, healthier and happier for British people. To seize those opportunities, we must bring together insights from business and academia to encourage the safe and responsible development of AI. That is why we are welcoming the partnership which Google and the University of Cambridge have announced today.

"As we prepare for next month's AI Safety Summit, this partnership shows that the UK - home to world-leading research facilities as well as some of the biggest tech companies in the world - is perfectly placed to support the innovation that underpins this critical technology."

Professor Anna Korhonen, Director of CHIA, said: "Here at the Centre for Human-Inspired Artificial Intelligence our researchers are dedicated to making sure that people are put at the very heart of new developments in AI. As our first funding partner, Google has been with us from the start of our journey, helping enable the breakthrough interdisciplinary research that we do. Partnerships like this - between academia and industry - will continue to be vital for the successful development of human-inspired AI."

Zoubin Ghahramani, VP of Research at Google DeepMind, is a Professor of Information Engineering at the University of Cambridge and has spearheaded this expanded partnership. He commented: "Google and the University of Cambridge share a deep commitment to developing AI responsibly, which means grounding innovation in scientific research, human values and our AI principles. We're excited by CHIA's potential to set new standards in responsible and human-centric AI development, and to unlock AI discoveries that could benefit everyone."

A recent report commissioned by Google and compiled by Public First (https://blog.google/around-the-globe/google-europe/unlocking-the-ai-powered-opportunity-in-the-uk/) quantified the opportunity AI presents to enhance the lives and businesses of everyone across the UK. It found that AI-powered innovation could create over £400 billion in economic value for the UK economy by 2030.
To ensure everyone can tap into that potential, regardless of whether or not they are in higher education, Google has launched free training offering people and businesses the practical skills and knowledge to capture the benefits of AI.

Summary: The University of Cambridge and Google are building on their long-standing partnership with a multi-year research collaboration agreement and a Google grant for the University's new Centre for Human-Inspired AI, to support progress in responsible AI that is inspired by and benefits people.

Image credit: Cambridge FilmWorks (https://www.cambridgefilmworks.com/)
Image description: Research underway in the Centre for Human-Inspired Artificial Intelligence

Sidebar: How AI can help people with motor disabilities - like my cousin

"My cousin was the victim of a brutal attack, and left with life-changing injuries - but with AI technology, we aim to empower people like her."

Aleesha Hamid, a PhD student at the Cambridge Centre for Human-Inspired AI, blogs on the Google website (https://blog.google/around-the-globe/google-europe/united-kingdom/ai-motor-disabilities/) about why her research aims to make a real difference to people like her cousin, who was left with a traumatic brain injury and uses technology to communicate.
Related links: BBC News (https://www.bbc.co.uk/news/technology-67132846), Independent (https://www.independent.co.uk/news/uk/google-university-of-cambridge-cambridge-matt-brittin-b2430799.html), Daily Mail (https://www.dailymail.co.uk/wires/pa/article-12638025/Google-help-fund-new-AI-research-centre-University-Cambridge.html), Evening Standard (https://www.standard.co.uk/news/uk/google-university-of-cambridge-michelle-donelan-cambridge-matt-brittin-b1113953.html), LBC (https://www.lbc.co.uk/tech/d068c171842542b180572a83f09d2863/), Business Weekly (https://www.businessweekly.co.uk/news/hi-tech/google-funding-new-cambridge-university-ai-hub-could-help-uk-grab-£400bn-opportunity), Yahoo News (https://uk.news.yahoo.com/google-help-fund-ai-research-230100476.html)

Published: Tue, 17 Oct 2023 07:54:22 +0000


Cambridge Zero takes centre stage at Climate Week NYC
/news/cambridge-zero-takes-centre-stage-at-climate-week-nyc

[Image: Photograph of New York skyline. Credit: Thomas Habr, Unsplash]

Professor Shuckburgh (Trinity and Darwin) is at Climate Week NYC (https://www.climateweeknyc.org/, 17-24 September) to discuss how to address the challenges the world faces in keeping global temperature rise to 1.5 degrees Celsius. She was at the Opening Ceremony on Sunday 17 September alongside world government, business, science and policy leaders, and appeared with top climate scientists in the opening video.
You can view the Opening Ceremony and video by registering for it here: https://www.climateweeknyc.org/climate-week-nyc-opening-ceremony.

Professor Shuckburgh will appear on Climate Week NYC's Hub Live on Tuesday 19 September with Helen Clarkson (Corpus Christi 1993), Chief Executive Officer of the Climate Group, which organises Climate Week; Kate Brandt (Selwyn 2007), Chief Sustainability Officer of Google; and Judith Wiese, Chief People & Sustainability Officer of Siemens AG, to discuss the innovation and investment needed to achieve net zero. You can register to view Tuesday's panel online through this link: New Frontiers of Climate Action (https://www.climateweeknyc.org/events/new-frontiers-climate-action-innovation-and-investment-needed-achieve-net-zero).

"Now is the time for Cambridge and the rest of the world to turn ambition into action in the race to accelerate the pace of a just transition to a net zero world, and New York will be buzzing with the kinds of people who can make that happen," Professor Shuckburgh said.

Climate Week NYC takes place in partnership with the United Nations General Assembly and is run in coordination with the United Nations and the City of New York. It is the largest annual climate event of its kind, bringing together some 400 events and activities across the City of New York - in person, hybrid and online.

This year it centres around the UN General Assembly and the UN Secretary-General's Climate Ambition Summit, as well as hundreds of national government, business and climate group initiatives, making it a unique opportunity for Cambridge to communicate with the world.

On Wednesday evening, just hours after the UN Secretary-General's Climate Ambition Summit concludes at the nearby headquarters of the United Nations, Professor Shuckburgh will lead a discussion for alumni in New York, hosted by Cambridge in America at the Morgan Library, about the technological and behavioural solutions available to build a sustainable future for the whole planet.

Professor Shuckburgh will be joined at the alumni event by Professor of Planetary Computing Anil Madhavapeddy (Pembroke) and Fiona Macklin (St John's 2012), Senior Adviser to Groundswell, a joint initiative between the Bezos Earth Fund, Global Optimism and the Systems Change Lab. The panel will be chaired by Professor Matthew Connelly, the new Director of the Centre for the Study of Existential Risk at the University of Cambridge.

Book online here to see Mission Possible: Creating a Better Planetary Future (https://www.cantab.org/events/mission-possible-creating-a-better-planetary-future).

"Our alumni network is one of Cambridge's greatest pillars of support, and with their help the University is able to amplify its work, linking one of the world's top research universities to peer institutions, policymakers and business leaders," Professor Shuckburgh said.

Throughout the visit to New York, Cambridge Zero will respond to news and relevant climate announcements with the help of an assembled Cambridge Climate Media team of academics at the University.
Summary: Cambridge Zero Director Professor Emily Shuckburgh takes centre stage at the world's biggest climate event of its kind in New York, talking to global leaders of government, business and philanthropy about Cambridge's efforts to tackle climate change.

Quote: "Now is the time for Cambridge and the rest of the world to turn ambition into action in the race to accelerate the pace of a just transition to a net zero world" - Prof Emily Shuckburgh

Image credit: Photograph of New York skyline - Thomas Habr, Unsplash (Public Domain)
Published: Fri, 15 Sep 2023 13:39:31 +0000


How to 'inoculate' millions against misinformation on social media
/stories/inoculateexperiment

Briefly exposing social media users to the tricks behind misinformation boosts awareness of dangerous online falsehoods - even amid the intense 'noise' of the world's second-most visited website.

Published: Wed, 24 Aug 2022 18:05:07 +0000


Making the digital world a safer place
/stories/improving-computer-security

New technology developed by Cambridge researchers and Arm to make our computers more secure is being put through its paces by tech companies in the UK and around the world.

Published: Wed, 25 May 2022 09:49:36 +0000


Cambridge University Library joins Google Arts and Culture
/stories/university-library-on-google-arts

Cambridge University Library partners with Google Arts and Culture to open its world-class collections freely to a global audience, becoming the first University of Cambridge institution to join the platform.

Published: Mon, 31 Aug 2020 08:00:42 +0000


Let's get statted
/research/features/lets-get-statted

[Image credit: Automatic Statistician]

"I keep saying that the sexy job in the next 10 years will be statisticians, and I'm not kidding," Hal Varian, Chief Economist at Google, famously observed in 2009. It seems a difficult assertion to take seriously, but six years on, there is little question that their skills are at a premium.

Indeed, we may need statisticians now more than at any time in our history. Even compared with a decade ago, we can now gather, produce and consume unimaginably large quantities of information.
As Varian predicted, statisticians who can crunch these numbers are all the rage, and a new discipline, 'data science', which fuses statistics and computational work, has emerged.

"People are awash in data," reflects Zoubin Ghahramani, Professor of Information Engineering at Cambridge. "This is occurring across industry, it's changing society as we become more digitally connected, and it's true of the sciences as well, where fields like biology and astronomy generate vast amounts of data."

Over the past few years, Richard Samworth, Professor of Statistics, has watched the datarati step out from the shadows. "It's probably fair to say that statistics didn't have the world's best PR for quite a long time," he says. "Since this explosion in the amount of data that we can collect and store, opportunities have arisen to answer questions we previously had no hope of being able to address. These demand an awful lot of new statistical techniques."

'Big data' is most obviously relevant to the sciences, where large volumes of information are gathered to answer questions in fields such as genetics, astronomy and particle physics, but it also has more familiar applications. Transport authorities gather data from electronic ticketing systems like Oyster cards to understand more about passenger movements; supermarkets closely monitor customer transactions to react to shoppers' predilections. As users of social media, many of us disclose data about ourselves that is as valuable to marketing as it is relevant to psychoanalytics. Increasingly, we are also 'lifeloggers', monitoring our own behaviour, health, diet and fitness through smart technology.

This information, as Ghahramani points out, is no use on its own: "It fills hard drives, but to extract value from it, we need methods that learn patterns in the data and allow us to make predictions and intelligent decisions." This is what statisticians, computer scientists and machine learning specialists bring to the party: they build algorithms, coded as computer software, to see patterns. At root, the datarati are interpreters.

Despite their 'sexy' new image, however, not enough data scientists exist to meet this rocketing demand. Could some aspects of the interpretation be automated using artificial intelligence instead, Ghahramani wondered? And so, in 2014 and with funding from Google, the first incarnation of The Automatic Statistician was launched online. Despite minimal publicity, 3,000 users uploaded datasets to it within a few months.

[Inset image: left to right, Zoubin Ghahramani and Richard Samworth]

Once fed a dataset, the Automatic Statistician assesses it against various statistical models, interprets the data and - uniquely - translates this interpretation into a short report of readable English. It does this without human intervention, drawing on an open-ended 'grammar' of statistical models. It is also deliberately conservative, basing its assessments only on sound statistical methodology, and even critiquing its own approach. In miniature, the idea looks something like the sketch below.
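To make that concrete, here is a toy sketch in Python of the "score a grammar of models, describe the winner in English" idea. It is purely illustrative and not the Automatic Statistician's actual code: the real system searches an open-ended grammar of models and writes a full report, whereas this sketch compares just three fixed regression models by BIC; all function names and the three-model 'grammar' are assumptions.

```python
import numpy as np

def auto_describe(x, y):
    """Score each candidate model with BIC and describe the best in English.

    The tiny model 'grammar' here (constant / linear / quadratic trend) is a
    stand-in for the open-ended grammar the real system searches over.
    """
    n = len(x)
    candidates = {
        "a constant level": np.ones((n, 1)),
        "a linear trend": np.column_stack([np.ones(n), x]),
        "a quadratic trend": np.column_stack([np.ones(n), x, x ** 2]),
    }
    best_name, best_bic = None, np.inf
    for name, design in candidates.items():
        # Least-squares fit, then Gaussian BIC = n*log(RSS/n) + k*log(n)
        beta = np.linalg.lstsq(design, y, rcond=None)[0]
        rss = float(np.sum((y - design @ beta) ** 2))
        bic = n * np.log(rss / n) + design.shape[1] * np.log(n)
        if bic < best_bic:
            best_name, best_bic = name, bic
    return f"The data appear to be best described by {best_name}."

# Example: a noisy straight line should yield "...a linear trend."
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 80)
print(auto_describe(x, 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)))
```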
Ghahramani and his team are now refining the system to cope with the messy, incomplete nature of real-world data, and also plan to develop its base of knowledge and to offer interactive reports. In the longer term, they hope that the Automatic Statistician will learn from its own work: "The idea is that it will look at a new dataset and say, 'Ah, I've seen this kind of thing before, so maybe I should check the model I used last time'," he explains.

While automated systems rely on existing models, new algorithms are needed to extract useful information from evolving and expanding datasets. Here, the role of human statisticians is vital.

To characterise the problem, Samworth presents a then-and-now comparison. During the past century, a typical statistical problem might, for instance, have been to understand the relationship between the initial speed and stopping distance of cars, based on a sample size of 50.

These days, however, we can record information on a huge number of variables at once: the weather, road surface, make of car, wind direction, and so on. Although the extra information has the potential to yield better models and reduce uncertainty, in many areas the number of features measured is so high that it may even exceed the number of observations. Identifying appropriate models in this context is a serious challenge, which requires the development of new algorithms.

To resolve this, statisticians rely on a principle called 'sparsity': the idea that only a few bits of the dataset are really important. The statistician identifies these needles in the haystack. Various algorithms have been developed to select the important variables, so that the initial sprawl of information starts to become manageable and patterns can be extracted.

Together with his colleague Dr Rajen Shah in the Department of Pure Mathematics and Mathematical Statistics, Samworth has developed a method for refining any such variable selection technique, called 'Complementary Pairs Stability Selection'. This applies the original method to random subsamples of the data instead of the whole, and does so over and over again. Eventually, the variables that appear on a high proportion of the subsamples emerge as those meriting further attention; the sketch below illustrates the idea.
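A minimal sketch of the complementary-pairs idea, assuming the lasso as the base selector. The subsample count, penalty and threshold are illustrative choices, not the authors' implementation, and the published method's theoretical error bounds are ignored here.

```python
import numpy as np
from sklearn.linear_model import Lasso

def stability_select(X, y, alpha=0.1, n_pairs=50, threshold=0.6, seed=0):
    """Complementary pairs stability selection (illustrative sketch).

    Each iteration splits the n samples into two disjoint halves and runs the
    base selector (here: the lasso) on both. A variable's selection
    proportion is the fraction of all half-sample fits that picked it;
    variables above `threshold` are flagged as meriting further attention.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_pairs):
        perm = rng.permutation(n)
        half = n // 2
        for idx in (perm[:half], perm[half:2 * half]):  # complementary pair
            fitted = Lasso(alpha=alpha).fit(X[idx], y[idx])
            counts += fitted.coef_ != 0
    proportions = counts / (2 * n_pairs)
    return np.flatnonzero(proportions >= threshold), proportions

# Example: 200 observations, 500 variables, only the first 5 truly matter -
# the kind of p > n setting described above.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 500))
y = X[:, :5] @ np.ones(5) + rng.normal(scale=0.5, size=200)
selected, _ = stability_select(X, y)
print(selected)  # expect (mostly) indices 0..4
```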
Scanning Google Scholar for citations of the paper in which this was proposed, Samworth finds that his algorithm has been used in numerous research projects. One looks at how to improve fundraising for disaster zones, another examines potential biomarkers for breast cancer survival, and a third identifies risk factors connected with childhood malnutrition.

How does he feel when he sees his work being applied so far and wide? "It's funny," he says. "My training is in mathematics and I still get a kick from proving a theorem, but it's also rewarding to see people using your work. It's often said that the good thing about being a statistician is that you get to play in everyone's back yard. I suppose this demonstrates why that's true."

Summary: With more information than ever at our fingertips, statisticians are vital to innumerable fields and industries. Welcome to the world of the datarati, where humans and machines team up to crunch the numbers.

Quote: "It fills hard drives, but to extract value from it, we need methods that learn patterns in the data and allow us to make predictions and intelligent decisions" - Zoubin Ghahramani

Image credit: Automatic Statistician (https://www.automaticstatistician.com/)

Related links: Automatic Statistician (https://www.automaticstatistician.com/)

Published: Wed, 03 Jun 2015 14:13:27 +0000


Sports calibrated
/research/features/sports-calibrated

[Image: The view from the top of the stands of Lee Valley VeloPark, London. Credit: British Cycling]

The bat makes contact with the ball; the ball flies back, back, back; and a thousand mobile phones capture it live as the ball soars over the fence and into the cheering crowd. Baseball is America's pastime and, as for many other spectator sports, mobile phones have had a huge effect on the experience of spending an afternoon at the ballpark.

But what to do with that video of a monster home run or a spectacular diving catch once the game is over? What did that same moment look like from the other end of the stadium? How many other people filmed exactly the same thing, but from different vantage points?
Could something useful be saved from what would otherwise be simply a sporting memory?

Dr Joan Lasenby of the University of Cambridge's Department of Engineering has been working on ways of gathering quantitative information from video, and thanks to an ongoing partnership with Google, a new method of digitally 'reconstructing' shared experiences such as sport or concerts is being explored at YouTube.

The goal is for users to upload their videos in collaboration with the event coordinator, and for a 'cloud'-based system to identify where in the space each video was taken from, creating a 'map' of different cameras from all over the stadium. The user can then choose which camera they want to watch, allowing them to experience the same event from dozens or even hundreds of different angles.

But although stitching together still images is reasonably straightforward, doing the same thing with video, especially when the distance between cameras can be on a scale as massive as a sports stadium, is much more difficult. "There's a lot of information attached to the still images we take on our phones or cameras, such as the type of camera, the resolution, the focus, and so on," said Lasenby. "But the videos we upload from our phones have none of that information attached, so patching them together is much more difficult."

Using a series of videos taken on mobile phones during a baseball game, the researchers developed a method of using visual information contained in the videos, such as a specific advertisement or other distinctive static features of the stadium, as a sort of 'anchor' which enables each video's location to be pinpointed.

"Another problem we had to look at was a way to separate the good frames from the bad," said Dr Stuart Bennett, a postdoctoral researcher in Lasenby's group, who developed this new method of three-dimensional reconstruction while a PhD student. "With the videos you take on your phone, usually you're not paying attention to the quality of each frame as you would with a still image. We had to develop a way of efficiently, and automatically, choosing the best frames and deleting the rest."

To identify where each frame originated from in the space, the technology automatically selects the best frames via measures of sharpness and edge or corner content, and then selects those which match; a simple version of the quality-filtering step is sketched below. The system works with as few as two cameras, and the team has tested it with as many as ten. YouTube has been stress-testing it further, expecting that the technology has the potential to improve fan engagement in the sports and music entertainment sectors.
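The article only names the cues ("sharpness and edge or corner content") without giving the team's actual measures, so the following is a hedged stand-in: a common OpenCV recipe that scores each frame by the variance of its Laplacian (blurry frames score low) and keeps the sharpest fraction.

```python
import cv2
import numpy as np

def sharpest_frames(video_path, keep_fraction=0.2):
    """Rank frames by a simple sharpness proxy and keep the best ones.

    Variance of the Laplacian is one standard sharpness score; smeared or
    out-of-focus frames have weak edges and therefore low variance. This is
    an assumed measure for illustration, not the team's published one.
    """
    cap = cv2.VideoCapture(video_path)
    scores, frames = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
        frames.append(frame)  # fine for a sketch; stream in production
    cap.release()
    keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-keep:]
    return [frames[i] for i in sorted(best)]  # preserve temporal order
```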
Although the technology is primarily intended for use in an entertainment context, Lasenby points out that it could potentially be applied for surveillance purposes as well. "It is a possible application down the road, and could one day be used by law enforcement to help provide information at the crime scene," said Lasenby. "At the moment, a lot of surveillance is done with fixed cameras, and you know everything about the camera. But this sort of technology might be able to give you information about what's going on in a particular video shot on a phone, by making locations in that video identifiable."

Another area where Lasenby's group is extracting quantitative data from video is in their partnership with British Cycling. Over the past decade, the UK has become a dominant force in international cycling, thanks to the quality of its riders and equipment, its partnerships with industry and academia, and its use of technology to help improve speeds on the track and on the road.

"In sport, taking qualitative videos and photographs is commonplace, which is extremely useful, as athletes aren't robots," said Professor Tony Purnell, Head of Technical Development for the Great Britain Cycling Team and Royal Academy of Engineering Visiting Professor at Cambridge. "But what we wanted was to start using image processing not just to gather qualitative information, but to get some good quantitative data as well."

Currently, elite cyclists are filmed on a turbo trainer, which is essentially a stationary bicycle in a lab or a wind tunnel. The resulting videos are then assessed to improve aerodynamics or help prevent injuries. "But for cyclists, especially sprinters, sitting on a constrained machine just isn't realistic," said Lasenby. "When you look at a sprinter on a track, they're throwing their bikes all over the place to get even the tiniest advantage. So we thought that if we could get quantitative data from video of them actually competing, it would be much more valuable than anything we got from a stationary turbo trainer."

To obtain this sort of data, the researchers used the same techniques as the gaming industry, where markers are used to obtain quantitative information about what's happening - similar to the team's work with Google.

One thing that simplifies the gathering of quantitative information from these videos is the ability to 'subtract' the background, so that only the athlete remains. But doing this is no easy task, especially as the boards of the velodrome and the legs of the cyclist are close to the same colour. Additionally, things that might appear minor to the human eye, such as shadows or changes in the light, make the mathematics of this type of subtraction extremely complicated. Working with undergraduate students, graduate students and postdoctoral researchers, however, Lasenby's team has developed real-time subtraction methods (of the general kind sketched below) to extract the data that may give the British team the edge as they prepare for the Rio Olympics in 2016.
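The team's own real-time algorithms are not described in detail here, so as an illustration of the general technique only, this sketch uses OpenCV's stock MOG2 background model, which learns the static background over time and, helpfully for the shadow problem mentioned above, labels shadow pixels separately so they can be discarded.

```python
import cv2

def athlete_masks(video_path):
    """Foreground/background separation sketch (not the Cambridge method).

    MOG2 maintains a per-pixel mixture-of-Gaussians background model, so a
    moving cyclist shows up as foreground against the near-static velodrome.
    """
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    cap = cv2.VideoCapture(video_path)
    masks = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fg = subtractor.apply(frame)
        fg[fg == 127] = 0  # MOG2 marks shadow pixels as 127 by default; drop them
        masks.append(fg)
    cap.release()
    return masks
```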
"Technology is massively important in sport," said Lasenby. "The techniques we're developing here are helping to advance how we experience sport, both as athletes and as fans."

[Inset images: credit British Cycling]

Summary: New methods of gathering quantitative data from video - whether shot on a mobile phone or an ultra-high-definition camera - may change the way that sport is experienced, for athletes and fans alike.

Quote: "The techniques we're developing here are helping to advance how we experience sport, both as athletes and as fans" - Dr Joan Lasenby

Image credit: British Cycling (https://www.flickr.com/photos/britishcycling/15759116409/)
Image description: The view from the top of the stands of Lee Valley VeloPark, London.
Published: Fri, 06 Feb 2015 15:08:06 +0000


Computing for the Future of the Planet wins Google funding
/research/news/computing-for-the-future-of-the-planet-wins-google-funding

[Image: Computing for the Future of the Planet. Credit: Professor Andy Hopper]

In the first-ever round of Google Focused Research Awards, Professor Andy Hopper, Head of the Computer Laboratory, has been awarded funding towards Computing for the Future of the Planet, a research programme that has set its sights on what computing can do for the environment.

Explaining the ethos behind the research he leads, Professor Hopper said: "Computing has had an enormous impact on the way we live and work, and a natural extension is to harness its power to solve problems facing the planet, whether it's energy consumption, pollution, congestion or sustainable living. Over the past five years, we have developed a strong interdisciplinary vision of a computer-based framework that will improve the way we live, and have been building and testing the deep-engineering technology needed to achieve it."

Computing for the Future of the Planet has several goals: an optimal digital infrastructure; sensing and optimising with a global world model; reliably predicting and reacting to our environment; and digital alternatives to physical activities.

One research area under investigation is a personal energy meter that would enable individuals to calculate their energy use in real time. Pulling information together from a variety of sources, the meter would calculate not only the energy being used directly by the user but also their share of communal energy use - for example, the buildings they work in, public transport, and even national overheads like healthcare. A toy version of this accounting appears below.
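The article does not specify how the meter would combine these sources, so the following is a purely hypothetical illustration of the core accounting idea: a person's footprint as their directly metered use plus a per-person share of each communal overhead.

```python
def personal_energy_kwh(direct_kwh, shared_loads):
    """Hypothetical sketch of the personal energy meter's accounting.

    `shared_loads` is a list of (total_kwh, number_of_users) pairs, one per
    communal overhead (office building, train, national services, ...);
    each overhead is split evenly among the people it serves. The real
    meter's data sources and apportioning rules are not described here.
    """
    return direct_kwh + sum(total / users for total, users in shared_loads)

# e.g. 3 kWh metered at home, plus shares of an office (4000 kWh, 200 staff)
# and a commuter train (12000 kWh, 600 passengers) -> ~43 kWh for the day
daily = personal_energy_kwh(3.0, [(4000, 200), (12000, 600)])
print(f"{daily:.1f} kWh")
```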
Google has awarded a total of $5.7 million to 13 projects through its recent Research Awards scheme, with Cambridge the only institution outside the USA to win funding. "Refreshingly, the gift from Google places no restrictions on the funded research," said Professor Hopper. "The nature of the award is simply to stimulate and accelerate the development of new ideas and practical solutions in these innovative areas."

For more information, please contact Professor Andy Hopper (ah12@cam.ac.uk).

Summary: A gift from Google will help Computing for the Future of the Planet.

Quote: "Computing has had an enormous impact on the way we live and work, and a natural extension is to harness its power to solve problems facing the planet, whether it's energy consumption, pollution, congestion or sustainable living." - Professor Andy Hopper

Image credit: Professor Andy Hopper
Image description: Computing for the Future of the Planet

Published: Sat, 01 May 2010 09:13:48 +0000