University of Cambridge - research /taxonomy/subjects/research en New datasets will train AI models to think like scientists /research/news/new-datasets-will-train-ai-models-to-think-like-scientists <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/polymathic-ai.jpg?itok=J6Vf_9mh" alt="A mosaic of simulations included in the Well collection of datasets" title="A mosaic of simulations included in the Well collection of datasets, Credit: Alex Meng, Aaron Watters and the Well Collaboration" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The initiative, called <a href="https://polymathic-ai.org/">Polymathic AI</a>, uses technology like that powering large language models such as OpenAI’s ChatGPT or Google’s Gemini. But instead of ingesting text, the project’s models learn using scientific datasets from across astrophysics, biology, acoustics, chemistry, fluid dynamics and more, essentially giving the models cross-disciplinary scientific knowledge.</p> <p>“These datasets are by far the most diverse large-scale collections of high-quality data for machine learning training ever assembled for these fields,” said team member Michael McCabe from the Flatiron Institute in New York City. “Curating these datasets is a critical step in creating multidisciplinary AI models that will enable new discoveries about our universe.”</p> <p>On 2 December, the Polymathic AI team released two of its open-source training dataset collections to the public — a colossal 115 terabytes, from dozens of sources — for the scientific community to use to train AI models and enable new scientific discoveries.
For comparison, GPT-3 used 45 terabytes of uncompressed, unformatted text for training, which ended up being around 0.5 terabytes after filtering.</p> <p>The full datasets are available to download for free on <a href="https://huggingface.co/">HuggingFace</a>, a platform hosting AI models and datasets. The Polymathic AI team provides further information about the datasets in <a href="https://nips.cc/virtual/2024/poster/97882">two</a> <a href="https://nips.cc/virtual/2024/poster/97791">papers</a> accepted for presentation at the <a href="https://neurips.cc/">NeurIPS</a> machine learning conference, to be held later this month in Vancouver, Canada.</p> <p>“Just as LLMs such as ChatGPT learn to use common grammatical structure across languages, these new scientific foundation models might reveal deep connections across disciplines that we’ve never noticed before,” said Cambridge team lead <a href="https://astroautomata.com/">Dr Miles Cranmer</a> from Cambridge’s Institute of Astronomy. “We might uncover patterns that no human can see, simply because no one has ever had both this breadth of scientific knowledge and the ability to compress it into a single framework.”</p> <p>AI tools such as machine learning are increasingly common in scientific research, and were recognised in two of this year’s <a href="/research/news/university-of-cambridge-alumnus-awarded-2024-nobel-prize-in-physics">Nobel</a> <a href="/research/news/university-of-cambridge-alumni-awarded-2024-nobel-prize-in-chemistry">Prizes</a>. Still, such tools are typically purpose-built for a specific application and trained using data from that field. The Polymathic AI project instead aims to develop models that are truly polymathic, like people whose expert knowledge spans multiple areas. The project’s team reflects intellectual diversity, with physicists, astrophysicists, mathematicians, computer scientists and neuroscientists.</p> <p>The first of the two new training dataset collections focuses on astrophysics.
Dubbed the Multimodal Universe, the dataset contains hundreds of millions of astronomical observations and measurements, such as portraits of galaxies taken by NASA’s James Webb Space Telescope and measurements of our galaxy’s stars made by the European Space Agency’s Gaia spacecraft.</p> <p>The other collection — called the Well — comprises over 15 terabytes of data from 16 diverse datasets. These datasets contain numerical simulations of biological systems, fluid dynamics, acoustic scattering, supernova explosions and other complicated processes. Cambridge researchers played a major role in developing both dataset collections, working alongside Polymathic AI and other international collaborators.</p> <p>While these diverse datasets may seem disconnected at first, they all require the modelling of mathematical equations called partial differential equations. Such equations pop up in problems related to everything from quantum mechanics to embryo development and can be incredibly difficult to solve, even for supercomputers. One of the goals of the Well is to enable AI models to churn out approximate solutions to these equations quickly and accurately.</p> <p>“By uniting these rich datasets, we can drive advancements in artificial intelligence not only for scientific discovery, but also for addressing similar problems in everyday life,” said Ben Boyd, PhD student in the Institute of Astronomy.</p> <p>Gathering the data for those datasets posed a challenge, said team member Ruben Ohana from the Flatiron Institute. The team collaborated with scientists to gather and create data for the project. “The creators of numerical simulations are sometimes sceptical of machine learning because of all the hype, but they’re curious about it and how it can benefit their research and accelerate scientific discovery,” he said.</p> <p>The Polymathic AI team is now using the datasets to train AI models.
In the coming months, they will deploy these models on various tasks to see how successful these well-rounded, well-trained AIs are at tackling complex scientific problems.</p> <p>“It will be exciting to see if the complexity of these datasets can push AI models to go beyond merely recognising patterns, encouraging them to reason and generalise across scientific domains,” said Dr Payel Mukhopadhyay from the Institute of Astronomy. “Such generalisation is essential if we ever want to build AI models that can truly assist in conducting meaningful science.”</p> <p>“Until now, we haven’t had a curated scientific-quality dataset covering such a wide variety of fields,” said Cranmer, who is also a member of Cambridge’s Department of Applied Mathematics and Theoretical Physics. “These datasets are opening the door to true generalist scientific foundation models for the first time. What new scientific principles might we discover? We're about to find out, and that's incredibly exciting.”</p> <p>The Polymathic AI project is run by researchers from the Simons Foundation and its Flatiron Institute, New York University, the University of Cambridge, Princeton University, the French Centre National de la Recherche Scientifique and the Lawrence Berkeley National Laboratory.</p> <p>Members of the Polymathic AI team from the University of Cambridge include PhD students, postdoctoral researchers and faculty across four departments: the Department of Applied Mathematics and Theoretical Physics, the Department of Pure Mathematics and Mathematical Statistics, the Institute of Astronomy and the Kavli Institute for Cosmology.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>What can exploding stars teach us about how blood flows through an artery? Or swimming bacteria about how the ocean’s layers mix?
A collaboration of researchers, including from the University of Cambridge, has reached a milestone toward training artificial intelligence models to find and use transferable knowledge between fields to drive scientific discovery.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://polymathic-ai.org/" target="_blank">Alex Meng, Aaron Watters and the Well Collaboration</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">A mosaic of simulations included in the Well collection of datasets</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 02 Dec 2024 15:59:08 +0000 sc604 248583 at New long-term collaboration with Suzano begins with a £10 million donation to support conservation and sustainability education and research /news/new-long-term-collaboration-with-suzano-begins-with-a-ps10-million-donation-to-support-conservation <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/news/mosaico.jpg?itok=Cqw8W0uH" alt="Image of a forest" title="Green forests stretch out to the horizon, Credit: Suzano" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>An initial £10 million donation will be used to support education and research into areas including the conservation of biodiversity, enhancing business sustainability, and the restoration of natural habitats in Brazil and beyond. The agreement will establish the Suzano Scholars Fund, a perpetual endowment at Jesus College to fund Brazilian nationals studying for a postgraduate degree at the University of Cambridge connected to the environment, ecology and conservation, educating the next generation of sustainability experts and leaders.
Funding will also be provided to academics based at the Conservation Research Institute to undertake research projects exploring the interaction between human and natural systems in areas such as biodiversity, climate change, water resource management, and ecosystem restoration. Read more about this new initiative <a href="https://www.philanthropy.cam.ac.uk/impact-of-giving/gift-announcements/new-collaboration-with-suzano">here</a>.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Suzano, one of the world’s largest producers of bio-based raw materials, based in São Paulo, Brazil, establishes a long-term initiative with Jesus College and the University of Cambridge. </p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">This visionary initiative will help to build strong links between the University of Cambridge and Brazil</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Professor Bhaskar Vira </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Suzano</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Green forests stretch out to the horizon</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License."
src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 15 Nov 2024 16:05:37 +0000 plc32 248555 at Early career researchers win major European funding /research/news/early-career-researchers-win-major-european-funding <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/plant-roots-interacting-with-arbuscular-mycorrhizal-fungi-credit-luginbuehl-lab.jpg?itok=MfvJK7-6" alt="Plant roots interacting with arbuscular mycorrhizal fungi. Image: Luginbuehl lab" title="Plant roots interacting with arbuscular mycorrhizal fungi.
Image: Luginbuehl lab, Credit: Luginbuehl lab" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Of 3,500 proposals reviewed by the ERC, only 14% were selected for funding – Cambridge has the highest number of grants of any UK institution.</p> <p>ERC Starting Grants – totalling nearly €780 million – support cutting-edge research in a wide range of fields, from life sciences and physics to social sciences and humanities.</p> <p>The awards help researchers at the beginning of their careers to launch their own projects, form their teams and pursue their most promising ideas. Starting Grants amount to €1.5 million per grant for a period of five years but additional funds can be made available.</p> <p>In total, the grants are estimated to create 3,160 jobs for postdoctoral fellows, PhD students and other staff at host institutions.</p> <p>Cambridge’s recipients work in a wide range of fields including plant sciences, mathematics and medicine. They are among 494 laureates who will be leading projects at universities and research centres in 24 EU Member States and associated countries. This year, the UK has received grants for 50 projects, Germany 98, France 49, and the Netherlands 51.</p> <h3><strong>Cambridge’s grant recipients for 2024 are:</strong></h3> <p><strong>Adrian Baez-Ortega</strong> (Dept. of Veterinary Medicine, Wellcome Sanger Institute) for Exploring the mechanisms of long-term tumour evolution and genomic instability in marine transmissible cancers</p> <p><strong>Claudia Bonfio</strong> (MRC Laboratory of Molecular Biology) for Lipid Diversity at the Onset of Life</p> <p><strong>Tom Gur</strong> (Dept. of Computer Science and Technology) for Sublinear Quantum Computation</p> <p><strong>Leonie Luginbuehl </strong>(Dept.
of Plant Sciences) for Harnessing mechanisms for plant carbon delivery to symbiotic soil fungi for sustainable food production</p> <p><strong>Julian Sahasrabudhe</strong> (Dept. of Pure Mathematics and Mathematical Statistics) for High Dimensional Probability and Combinatorics</p> <p><strong>Richard Timms</strong> (Cambridge Institute for Therapeutic Immunology and Infectious Disease) for Deciphering the regulatory logic of the ubiquitin system</p> <p><strong>Hannah Übler</strong> (Dept. of Physics) for Active galactic nuclei and Population III stars in early galaxies</p> <p><strong>Julian Willis</strong> (Yusuf Hamied Department of Chemistry) for Studying viral protein-primed DNA replication to develop new gene editing technologies</p> <p><strong>Federica Gigante</strong> (Faculty of History) for Unveiling Networks: Slavery and the European Encounter with Islamic Material Culture (1580–1700) – Grant hosted by the University of Oxford</p> <p>Professor Sir John Aston FRS, Pro-Vice-Chancellor for Research at the University of Cambridge, said:</p> <p>“Many congratulations to the recipients of these awards, which reflect the innovation and the vision of these outstanding investigators. We are fortunate to have many exceptional young researchers across a wide range of disciplines here in Cambridge and awards such as these highlight some of the amazing research taking place across the university. I wish this year’s recipients all the very best as they begin their new programmes and can’t wait to see the outcomes of their work.”</p> <p>Iliana Ivanova, European Commissioner for Innovation, Research, Culture, Education and Youth, said:</p> <p>“The European Commission is proud to support the curiosity and passion of our early-career talent under our Horizon Europe programme. The new ERC Starting Grants winners aim to deepen our understanding of the world. Their creativity is vital to finding solutions to some of the most pressing societal challenges.
In this call, I am happy to see one of the highest shares of female grantees to date, a trend that I hope will continue. Congratulations to all!”</p> <p>President of the European Research Council, Prof. Maria Leptin, said:</p> <p>“Empowering researchers early on in their careers is at the heart of the mission of the ERC. I am particularly pleased to welcome UK researchers back to the ERC. They have been sorely missed over the past years. With fifty grants awarded to researchers based in the UK, this influx is good for the research community overall.”</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Nine Cambridge researchers are among the latest recipients of highly competitive and prestigious European Research Council (ERC) Starting Grants.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.luginbuehllab.com/" target="_blank">Luginbuehl lab</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Plant roots interacting with arbuscular mycorrhizal fungi. Image: Luginbuehl lab</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.
Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Thu, 05 Sep 2024 09:30:00 +0000 ta385 247641 at AI and scholarship: a manifesto /stories/ai-and-scholarship-manifesto <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Two leading academics from the University's School of Humanities and Social Sciences provide a framework that supports scholars and students in navigating generative AI.</p> </p></div></div></div> Fri, 15 Mar 2024 10:39:00 +0000 fpjl2 245171 at Cambridge ReseARch Trail /stories/cambridge-ar-trail <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A new augmented reality trail, launched as part of the Cambridge Festival, is showcasing the world-leading research of the University of Cambridge in a new light.</p> </p></div></div></div> Thu, 14 Mar 2024 16:34:06 +0000 zs332 245141 at Butterflies, Bees and Broader Horizons /stories/cambridge-zero-future-leaders
<div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge Zero programme prepares young climate leaders for the future</p> </p></div></div></div> Fri, 17 Nov 2023 16:22:18 +0000 plc32 243321 at “Write fewer papers, take more risks”: researchers call for ‘rebellion’ /research/news/write-fewer-papers-take-more-risks-researchers-call-for-rebellion <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/joakimb885by428_0.jpg?itok=33wnmFbS" alt="Student performing on rigging at the University of Stockholm’s ‘Department of Circus’, which explores different disciplines through circus arts." title="Student performing on rigging at the University of Stockholm’s ‘Department of Circus’, which explores different disciplines through circus arts., Credit: Joakim Björklund" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The appeal is the starting point for a new book which questions prevailing orthodoxies in academia. Its editors, who are four academics based in Britain and Australia, invite university staff to “rise up and rebel” against these conventions.
They attack the assumption that the main output of research should be papers for scholarly journals, describing this as the “boring stuff” of their profession, which often undermines its quality and public value.</p> <p>Instead, the book calls for more university researchers to “depart radically” from traditional modes of academic production and combine forces with organisations beyond the ‘academy’, “to do the radical kind of work that the world needs right now, in a time of climate change, the COVID-19 pandemic, and rising nationalism and populism.”</p> <p>It examines, in particular, how this could be achieved through the arts. In a wide-ranging survey, different contributors cite examples of how academics have used creative writing, poetry, podcasts, music – and less obvious media including circus arts and magic – both to communicate their work, and as research tools.</p> <p>The book, <em>Doing Rebellious Research in and beyond the Academy</em>, has been co-written by social scientists, critical theorists and performing artists. It argues that although universities often claim to be interdisciplinary, many academics still work in silos – rarely collaborating with colleagues, let alone beyond their institutions.</p> <p>It adds that this is often a consequence of convention and not intention, and that rather than being inherently remote and ‘stuffy’, as cliché might have it, many academics are under constant pressure to publish in specialist journals. The volume itself is an anthology of “creative essays” exemplifying alternative ways to present research: as creative writing, poetry and art.</p> <p>Pamela Burnard, one of the co-editors and a Professor of Arts, Creativities and Educations at the Faculty of Education, University of Cambridge, said: “Universities are meant to exist for everyone’s benefit.
It’s bizarre that their main research output is complex, esoteric writing that only a few other academics read or understand.”</p> <p>“Nobody is claiming that academic writing is pointless, but why is it the norm? If we want research to address the biggest challenges facing society, we need academics to have the confidence – in a sense the permission – to depart radically from it. We need to be braver and take more risks with what we do.”</p> <p>In the book’s prologue, the editors quote a similar point made by the anthropologist, Mary Pratt, in 1988: “How could such interesting people, doing such interesting things, produce such dull books?”</p> <p>They argue the arts provide alternative modes of expression that give non-academics better opportunities to connect meaningfully with academic ideas. They also suggest that when used as part of the research process, the arts give academics a means to ‘live’ and ‘experience’ their research as something creative and engaging. This often enables them to see the work differently and innovate further. The book provides numerous examples of how this has been done by researchers around the world, using forms such as dance, the visual arts, poetry, hip-hop and podcasting.</p> <p>One example is the <a href="https://education.uq.edu.au/event/session/3412">‘Departing Radically in Academic Writing’</a> programme in Australia, which trains postgraduate students not just to turn their research into creative writing, but to use it as a research method. Its methods include ‘thesis drabbling’, in which students summarise their thesis as 100 words of stream-of-consciousness prose.
Students say this has helped them to make their work “more human”, focus on its real purpose, and reconnect emotionally with why they wanted to do research in the first place.</p> <p>Elsewhere, the book presents the recent case of a University of Cambridge student who used <a href="/stories/UE-Simone-Eringfeld">podcasting</a> to collect data from students and staff for a study about how COVID-19 affected university life. It explains how the project stemmed partly from a dance workshop and ended with her releasing an <a href="https://open.spotify.com/album/1TGTc2RIFUqvIlvGy9pRhx">electronica and spoken word album</a> featuring performed fragments of the interviews on Spotify, to convey the fears and anxieties experienced on campuses during lockdown.</p> <p>In a separate chapter, a psychologist discusses how she used <a href="https://research.brighton.ac.uk/en/publications/beyond-disciplines-spoken-word-as-participatory-arts-based-resear">slam poetry and spoken word art</a> to get marginalised young people to open up about their experiences of social injustice. She concludes that poetry can be used to challenge established “notions of what research and knowledge look like.”</p> <p>This book also touches on even more offbeat artforms. One chapter, for example, reports on the Stockholm University of the Arts <a href="https://www.uniarts.se/english/about-uniarts/department-of-circus">‘Department of Circus’</a>.
This trains circus performers but has also used the unexpected realm of circus arts, and their capacity to test the extremes of human ability and self-control, to undertake studies into issues such as teamwork and collaboration in <a href="https://www.uniarts.se/forskning-utvecklingsarbete/forskningsprojekt/sprangbradans-akrobatik">high-risk environments</a>.</p> <p>In a similar vein, a chapter co-authored by a medic, an award-winning biomechanics researcher, and an illusionist and escapologist describes how the <a href="https://www.magicacademy.co.uk/about-us/">Academy of Magic &amp; Science</a> has created ‘magic shows’ which introduce audiences to transdisciplinary practices and ideas connecting diverse fields such as engineering, chemistry, electronics, physiology, psychology and performance cultures. The co-authors argue that the careful structuring of magic acts, to provoke curiosity and surprise, could be applied more widely in scientific writing. They suggest that presenting research as an illusionist might do could engage wider audiences far more than the “cold lists of data and conclusions” in many scientific papers.</p> <p>Burnard said she fully expects the book, which features plenty of other, different examples of rebellious scholarly writing, to be “written off” by some scholars. “Our ideas and intentions are challenging – but that’s something that academics are meant to be,” she added. “The emergence of unimagined possibilities should be celebrated.”</p> <p><em>Doing Rebellious Research in and beyond the Academy</em> is published by Brill-i-Sense.
It will be widely available following a launch event in Cambridge on Monday 6 June.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A group of education specialists are urging researchers to challenge the “structures and regulations” which define academic scholarship, arguing that different approaches are needed in an age of climate change, COVID-19 and rising populism.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">“Nobody is claiming that academic writing is pointless, but why is it the norm? If we want research to address the biggest challenges facing society, we need academics to have the confidence – in a sense the permission – to depart radically from it. We need to be braver and take more risks with what we do.” </div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Pamela Burnard</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Joakim Björklund</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Student performing on rigging at the University of Stockholm’s ‘Department of Circus’, which explores different disciplines through circus arts.</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png"
style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 06 Jun 2022 06:46:37 +0000 tdk25 232601 at ‘Robot scientist’ Eve finds that less than one third of scientific results are reproducible /research/news/robot-scientist-eve-finds-that-less-than-one-third-of-scientific-results-are-reproducible <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/breast-cancer-cell.jpg?itok=A3oLbOmf" alt="Breast Cancer Cell" title="Breast Cancer Cell, Credit: NIH Image Gallery" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, led by the University of Cambridge, analysed more than 12,000 research papers on breast cancer cell biology. After narrowing the set down to 74 papers of high scientific interest, fewer than one-third – 22 papers – were found to be reproducible. 
In two cases, Eve was able to make serendipitous discoveries.</p> <p>The <a href="https://royalsocietypublishing.org/doi/10.1098/rsif.2021.0821">results</a>, reported in the <em>Journal of the Royal Society Interface</em>, demonstrate that it is possible to use robotics and artificial intelligence to help address the reproducibility crisis.</p> <p>A result is reproducible when another scientist, in a different laboratory under similar conditions, can achieve the same result. But more than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce some of their own experiments: this is the reproducibility crisis.</p> <p>“Good science relies on results being reproducible: otherwise, the results are essentially meaningless,” said Professor Ross King from Cambridge’s Department of Chemical Engineering and Biotechnology, who led the research. “This is particularly critical in biomedicine: if I’m a patient and I read about a promising new potential treatment, but the results aren’t reproducible, how am I supposed to know what to believe? The result could be people losing trust in science.”</p> <p>Several years ago, King developed the robot scientist Eve, a computer/robotic system that uses techniques from artificial intelligence (AI) to carry out scientific experiments.</p> <p>“One of the big advantages of using machines to do science is they’re more precise and record details more exactly than a human can,” said King. “This makes them well-suited to the job of attempting to reproduce scientific results.”</p> <p>As part of a project funded by DARPA, King and his colleagues from the UK, US and Sweden designed an experiment that uses a combination of AI and robotics to help address the reproducibility crisis, by getting computers to read scientific papers and understand them, and getting Eve to attempt to reproduce the experiments.</p> <p>For the current paper, the team focused on cancer research. 
“The cancer literature is enormous, but no one ever does the same thing twice, making reproducibility a huge issue,” said King, who also holds a position at Chalmers University of Technology in Sweden. “Given the vast sums of money spent on cancer research, and the sheer number of people affected by cancer worldwide, it’s an area where we urgently need to improve reproducibility.”</p> <p>From an initial set of more than 12,000 published scientific papers, the researchers used automated text mining techniques to extract statements related to a change in gene expression in response to drug treatment in breast cancer. From this set, 74 papers were selected.</p> <p>Two human teams, each using Eve and two breast cancer cell lines, attempted to reproduce the 74 results. Statistically significant evidence for repeatability was found for 43 papers, meaning that the results were replicable under identical conditions; and significant evidence for reproducibility or robustness was found in 22 papers, meaning the results were replicable by different scientists under similar conditions. In two cases, the automation made serendipitous discoveries.</p> <p>While only 22 out of 74 papers were found to be reproducible in this experiment, the researchers say that this does not mean that the remaining papers are not scientifically reproducible or robust. “There are lots of reasons why a particular result may not be reproducible in another lab,” said King. “Cell lines can sometimes change their behaviour in different labs under different conditions, for instance. 
The most important difference we found was that it matters who does the experiment, because every person is different.”</p> <p>King says that this work shows that automated and semi-automated techniques could be an important tool to help address the reproducibility crisis, and that reproducibility should become a standard part of the scientific process.</p> <p>“It’s quite shocking how big of an issue reproducibility is in science, and it’s going to need a complete overhaul in the way that a lot of science is done,” said King. “We think that machines have a key role to play in helping to fix it.”</p> <p>The research was also funded by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and the Wallenberg AI, Autonomous Systems and Software Program (WASP).</p> <p><em><strong>Reference:</strong><br /> Katherine Roper et al. ‘<a href="https://royalsocietypublishing.org/doi/10.1098/rsif.2021.0821">Testing the reproducibility and robustness of the cancer biology literature by robot</a>.’ Journal of the Royal Society Interface (2022). DOI: 10.1098/rsif.2021.0821</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have used a combination of automated text analysis and the ‘robot scientist’ Eve to semi-automate the process of reproducing research results. 
The lack of reproducibility is one of the biggest crises facing modern science.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">One of the big advantages of using machines to do science is they’re more precise and record details more exactly than a human can</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Ross King</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/132318516@N08/28264909965" target="_blank">NIH Image Gallery</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Breast Cancer Cell</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution-noncommerical">Attribution-Noncommercial</a></div></div></div> Tue, 05 Apr 2022 23:20:02 +0000 sc604 231261 at
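The Eve study above scored a published result as “repeatable” or “reproducible” by looking for statistically significant evidence that a reported gene-expression change recurred when the experiment was re-run. As a rough illustration only (this is not the study’s actual pipeline: the data, function names and 0.05 threshold below are invented, and a simple permutation test stands in for whatever statistics the team actually used), such a check might look like:

```python
# Illustrative sketch only: NOT the Eve pipeline. A reported gene-expression
# change is treated as "replicating" if the re-run measurements differ in the
# reported direction and the difference survives a permutation test.
import random
from statistics import mean


def mean_diff(treated, control):
    """Difference in mean expression between treated and control samples."""
    return mean(treated) - mean(control)


def permutation_p_value(treated, control, n_perm=5000, seed=0):
    """Two-sided permutation test: how often does a random relabelling of the
    pooled samples give a mean difference at least as large as the observed one?"""
    rng = random.Random(seed)
    observed = abs(mean_diff(treated, control))
    pooled = list(treated) + list(control)
    k = len(treated)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(mean(pooled[:k]) - mean(pooled[k:])) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one smoothing avoids p = 0


def replicates(reported_direction, treated, control, alpha=0.05):
    """True if the change is significant and matches the reported direction
    ("up" or "down"); alpha = 0.05 is an arbitrary illustrative threshold."""
    same_direction = (mean_diff(treated, control) > 0) == (reported_direction == "up")
    return same_direction and permutation_p_value(treated, control) < alpha
```

With clearly separated hypothetical measurements, e.g. `replicates("up", [2.1, 2.4, 2.2, 2.5, 2.3], [1.0, 1.1, 0.9, 1.05, 1.0])`, the check passes; the study’s actual criteria distinguishing “repeatability” (identical conditions) from “reproducibility” (different scientists, similar conditions) were of course richer than this.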