University of Cambridge - Cecilia Mascolo /taxonomy/people/cecilia-mascolo en Cambridge researchers awarded ERC funding to support commercial potential of their work /research/news/cambridge-researchers-awarded-erc-funding-to-support-commercial-potential-of-their-work <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/erc-poc.jpg?itok=UoZGh5AJ" alt="Left: Cecilia Mascolo, Right: Ismail Sami" title="Left: Cecilia Mascolo, Right: Ismail Sami, Credit: University of Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Professor Cecilia Mascolo from the Department of Computer Science and Technology will use the funding to further her work on developing mobile devices – like commercially available earbuds – that can accurately pick up wearers’ body sounds and monitor them for health purposes.</p>&#13; &#13; <p>The ERC Proof of Concept grants – worth €150,000 – help researchers bridge the gap between the discoveries stemming from their frontier research and the practical application of the findings, including the early phases of their commercialisation.</p>&#13; &#13; <p>Researchers use this type of funding to verify the practical viability of scientific concepts, explore business opportunities or prepare patent applications.</p>&#13; &#13; <p>Mascolo’s existing ERC-funded Project EAR was the first to demonstrate that the existing microphones in earbuds can be used to pick up wearers’ levels of activity and heart rate and to trace them accurately even when the wearer is exercising vigorously.</p>&#13; &#13; <p>She now wants to build on this work by enhancing the robustness of these in-ear microphones and further improving their performance in monitoring human activity and physiology in
'real life' conditions, including by developing new algorithms to help the devices analyse the data they are collecting.</p>&#13; &#13; <p>“There are currently no solutions on the market that use audio devices to detect body function signals like this, and they could play an extremely valuable role in health monitoring,” said Mascolo. “Because the devices’ hardware, computing needs and energy consumption are inexpensive, they could put body function monitoring into the hands of the world's population accurately and affordably.”</p>&#13; &#13; <p>Professor Manish Chhowalla from the Department of Materials Science and Metallurgy was awarded a Proof of Concept Grant to demonstrate large-scale and high-performance lithium-sulfur batteries.</p>&#13; &#13; <p>“Our breakthrough in lithium-sulphur batteries demonstrates a future beyond lithium-ion batteries; moving away from the reliance on critical raw materials and enabling the electrification of fundamentally new applications such as aviation,” said Dr Ismail Sami, Research Fellow in Chhowalla’s group. “This Proof of Concept will help us take the essential commercial and technical steps in bringing our innovation to market.”</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>University of Cambridge researchers have been awarded Proof of Concept grants from the European Research Council (ERC) to help them explore the commercial or societal potential of their research.
The funding is part of the EU's research and innovation programme, Horizon Europe.</p>&#13; </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">University of Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Left: Cecilia Mascolo, Right: Ismail Sami</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved.
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 02 Aug 2023 16:05:46 +0000 sc604 241111 at Fitness levels accurately predicted using wearable devices – no exercise required /research/news/fitness-levels-can-be-accurately-predicted-using-wearable-devices-no-exercise-required <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/fitness-monitor.jpg?itok=wvdgtpK6" alt="Woman checking her smart watch and mobile phone after run" title="Woman checking her smart watch and mobile phone after run, Credit: Oscar Wong via Getty Images" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Normally, tests to accurately measure VO2max – a key measurement of overall fitness and an important predictor of heart disease and mortality risk – require expensive laboratory equipment and are mostly limited to elite athletes. 
The new method uses machine learning to predict VO2max – the capacity of the body to carry out aerobic work – during everyday activity, without the need for contextual information such as GPS measurements.</p>&#13; &#13; <p>In what is by far the largest study of its kind, the researchers gathered activity data from more than 11,000 participants in the Fenland Study using wearable sensors, with a subset of participants tested again seven years later. The researchers used the data to develop a model to predict VO2max, which was then validated against a third group that carried out a standard lab-based exercise test. The model showed a high degree of accuracy compared to lab-based tests, and outperformed other approaches.</p>&#13; &#13; <p>Some smartwatches and fitness monitors currently on the market claim to provide an estimate of VO2max, but since the algorithms powering these predictions aren’t published and are subject to change at any time, it’s unclear whether the predictions are accurate, or whether an exercise regime is having any effect on an individual’s VO2max over time.</p>&#13; &#13; <p>The Cambridge-developed model is robust, transparent and provides accurate predictions based on heart rate and accelerometer data only. Since the model can also detect fitness changes over time, it could be useful in estimating fitness levels for entire populations and identifying the effects of lifestyle trends. <a href="https://www.nature.com/articles/s41746-022-00719-1">The results are reported in the journal <em>npj Digital Medicine</em></a>.</p>&#13; &#13; <p>A measurement of VO2max is considered the ‘gold standard’ of fitness tests. Professional athletes, for example, test their VO2max by measuring their oxygen consumption while they exercise to the point of exhaustion. There are other ways of measuring fitness in the laboratory, like heart rate response to exercise tests, but these require equipment like a treadmill or exercise bike.
Additionally, strenuous exercise can be a risk to some individuals.</p>&#13; &#13; <p>“VO2max isn’t the only measurement of fitness, but it’s an important one for endurance, and is a strong predictor of diabetes, heart disease, and other mortality risks,” said co-author Dr Soren Brage from Cambridge’s Medical Research Council (MRC) Epidemiology Unit. “However, since most VO2max tests are done on people who are reasonably fit, it’s hard to get measurements from those who are not as fit and might be at risk of cardiovascular disease.”</p>&#13; &#13; <p>“We wanted to know whether it was possible to accurately predict VO2max using data from a wearable device, so that there would be no need for an exercise test,” said co-lead author Dr Dimitris Spathis from Cambridge’s Department of Computer Science and Technology. “Our central question was whether wearable devices can measure fitness in the wild. Most wearables provide metrics like heart rate, steps or sleeping time, which are proxies for health, but aren’t directly linked to health outcomes.”</p>&#13; &#13; <p>The study was a collaboration between the two departments: the team from the MRC Epidemiology Unit provided expertise in population health and cardiorespiratory fitness and data from the Fenland Study – a long-running public health study in the East of England – while the team from the Department of Computer Science and Technology provided expertise in machine learning and artificial intelligence for mobile and wearable data.</p>&#13; &#13; <p>Participants in the study wore wearable devices continuously for six days. The sensors gathered 60 values per second, resulting in an enormous amount of data before processing. “We had to design an algorithm pipeline and appropriate models that could compress this huge amount of data and use it to make an accurate prediction,” said Spathis.
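The pipeline Spathis describes – compressing a high-rate sensor stream into summary features and then mapping those features to a fitness score – can be sketched in miniature as follows. This is an invented illustration, not the study's actual model (the published system uses a deep neural network with learned parameters); all names, values and coefficients here are made up:

```python
import statistics

def summarise_window(heart_rates, accel_magnitudes):
    """Compress one window of raw sensor readings into summary features.

    The study describes compressing roughly 60 sensor values per second;
    this toy version keeps just three statistics per window.
    """
    return {
        "hr_mean": statistics.fmean(heart_rates),
        "hr_sd": statistics.pstdev(heart_rates),
        "accel_mean": statistics.fmean(accel_magnitudes),
    }

def predict_vo2max(features, weights, bias):
    """A deliberately simple stand-in for the study's deep neural
    network: a linear model over the summary features."""
    return bias + sum(weights[name] * value for name, value in features.items())

# Invented example values -- real coefficients would be learned from data.
window = summarise_window(
    heart_rates=[62, 64, 63, 65, 61, 66],
    accel_magnitudes=[1.02, 0.98, 1.05, 1.01, 0.99, 1.00],
)
weights = {"hr_mean": -0.2, "hr_sd": -0.5, "accel_mean": 8.0}
estimate = predict_vo2max(window, weights, bias=55.0)
```

In the published work the mapping from features to VO2max is learned by a deep neural network rather than fixed linear weights; the sketch only shows the overall shape of the computation, from raw stream to a single fitness estimate.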
“The free-living nature of the data makes this prediction challenging because we’re trying to predict a high-level outcome (fitness) with noisy low-level data (wearable sensors).”</p>&#13; &#13; <p>The researchers used an AI model known as a deep neural network to process and extract meaningful information from the raw sensor data and make predictions of VO2max from it. Beyond predictions, the trained models can be used for the identification of sub-populations in particular need of intervention related to fitness.</p>&#13; &#13; <p>The baseline data from 11,059 participants in the Fenland Study was compared with follow-up data from seven years later, taken from a subset of 2,675 of the original participants. A third group of 181 participants from the UK Biobank Validation Study underwent lab-based VO2max testing to validate the accuracy of the algorithm. The machine learning model had strong agreement with the measured VO2max scores at both baseline (82% agreement) and follow-up testing (72% agreement).</p>&#13; &#13; <p>“This study is a perfect demonstration of how we can leverage expertise across epidemiology, public health, machine learning and signal processing,” said co-lead author Dr Ignacio Perez-Pozuelo.</p>&#13; &#13; <p>The researchers say that their results demonstrate how wearables can accurately measure fitness, but transparency needs to be improved if measurements from commercially available wearables are to be trusted.</p>&#13; &#13; <p>“It’s true in principle that many fitness monitors and smartwatches provide a measurement of VO2max, but it’s very difficult to assess the validity of those claims,” said Brage. “The models aren’t usually published, and the algorithms can change on a regular basis, making it difficult for people to determine if their fitness has actually improved or if it’s just being estimated by a different algorithm.”</p>&#13; &#13; <p>“Everything on your smartwatch related to health and fitness is an estimate,” said Spathis.
“We’re transparent about our modelling and we did it at scale. We show that we can achieve better results with the combination of noisy data and traditional biomarkers. Also, all our algorithms and models are open-sourced and everyone can use them.”</p>&#13; &#13; <p>“We’ve shown that you don’t need an expensive test in a lab to get a real measurement of fitness – the wearables we use every day can be just as powerful, if they have the right algorithm behind them,” said senior author Professor Cecilia Mascolo from the Department of Computer Science and Technology. “Cardio-fitness is such an important health marker, but until now we did not have the means to measure it at scale. These findings could have significant implications for population health policies, so we can move beyond weaker health proxies such as the Body Mass Index (BMI).”</p>&#13; &#13; <p>The research was supported in part by Jesus College, Cambridge and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Cecilia Mascolo is a Fellow of Jesus College, Cambridge.</p>&#13; &#13; <p> </p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Dimitris Spathis et al. ‘<a href="https://www.nature.com/articles/s41746-022-00719-1">Longitudinal cardio-respiratory fitness prediction through wearables in free-living environments</a>.’ npj Digital Medicine (2022).
DOI: 10.1038/s41746-022-00719-1</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge researchers have developed a method for measuring overall fitness accurately on wearable devices – and more robustly than current consumer smartwatches and fitness monitors – without the wearer needing to exercise.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">You don’t need an expensive test in a lab to get a real measurement of fitness – the wearables we use every day can be just as powerful, if they have the right algorithm behind them</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Cecilia Mascolo</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.gettyimages.co.uk/detail/photo/woman-checking-her-smart-watch-and-mobile-phone-royalty-free-image/1257794436?phrase=fitness monitor&amp;amp;adppopup=true" target="_blank">Oscar Wong via Getty Images</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Woman checking her smart watch and mobile phone after run</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a
href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 01 Dec 2022 10:00:18 +0000 sc604 235691 at On the move /stories/nokia-bell-labs <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Wearable devices will play an important part in the healthcare delivery of the future. Cambridge and Nokia Bell Labs are working together to build the foundations for these transformative new technologies.
</p> </p></div></div></div> Thu, 02 Jul 2020 14:35:14 +0000 skbf2 215932 at New app collects the sounds of COVID-19 /research/news/new-app-collects-the-sounds-of-covid-19 <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/23354lores.jpg?itok=FsMKmS-q" alt="" title="Transmission electron microscopic image of an isolate from the first US case of COVID-19, Credit: CDC/ Hannah A Bullock; Azaibi Tamin" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The <a href="https://www.covid-19-sounds.org/">COVID-19 Sounds App</a> is now available as a web app for Chrome and Firefox browsers. Versions for Android and iOS will be available soon.</p> <p>As COVID-19 is a respiratory condition, the sounds made by people with the condition – including voice, breathing and cough sounds – are very specific. A large, crowdsourced data set will be useful in developing machine learning algorithms that could be used for automatic detection of the condition.</p> <p>"There’s still so much we don’t know about this virus and the illness it causes, and in a pandemic situation like the one we’re currently in, the more reliable information you can get, the better," said Professor Cecilia Mascolo from Cambridge’s Department of Computer Science and Technology, who led the development of the app.</p> <p>"I am amazed at the speed that we managed to connect across the University to conceive this project, and how Cecilia's team of developers came together to respond to the urgency of the situation," said Professor Pietro Cicuta from Cambridge’s Cavendish Laboratory, a member of the team behind the app’s development.
Professor Andres Floto, Professor of Respiratory Biology at the University, and Research Director of the Cambridge Centre for Lung Infection at Papworth Hospital, Cambridge, has also advised on the clinical aspects of the app.</p> <p>The COVID-19 Sounds App collects basic demographic and medical information from users, as well as spoken voice samples, breathing and coughing samples through the phone’s microphone. The app will also ask users if they have tested positive for the coronavirus.</p> <p>In addition, the app will collect one coarse-grained location sample. The app will not track users, and will only collect location data once when users are actively using it. The data will be stored on University servers and be used solely for research purposes. The app will not provide any medical advice.</p> <p>Once they have completed their initial analysis of the data collected by the app, the team will release the dataset to other researchers. The dataset could help shed light on disease progression and on the relationship of respiratory complications with medical history, for example.</p> <p>"Having spoken to doctors, one of the most common things they have noticed about patients with the virus is the way they catch their breath when they’re speaking, as well as a dry cough, and the intervals of their breathing patterns," said Mascolo. "There are very few large datasets of respiratory sounds, so to make better algorithms that could be used for early detection, we need as many samples from as many participants as we can get.
Even if we don’t get many positive cases of coronavirus, we could find links with other health conditions."</p> <p>The study has been approved by the Ethics Committee of the Department of Computer Science and Technology, and is partly funded by the European Research Council through Project EAR.</p> <h2>How you can support Cambridge's COVID-19 research effort</h2> <p><a href="https://www.philanthropy.cam.ac.uk/civicrm/contribute/transact?reset=1&amp;id=2962" title="Link: Make a gift to support COVID-19 research at the University">Donate to support COVID-19 research at Cambridge</a></p> <p> </p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A new app, which will be used to collect data to develop machine learning algorithms that could automatically detect whether a person is suffering from COVID-19 based on the sound of their voice, their breathing and coughing, has been launched by researchers at the University of Cambridge.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://phil.cdc.gov/Details.aspx?pid=23354" target="_blank">CDC/ Hannah A Bullock; Azaibi Tamin</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Transmission electron microscopic image of an isolate from the first US case of COVID-19</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width: 0px;" /></a><br /> The text in this work is licensed under a <a
href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 06 Apr 2020 12:31:15 +0000 sc604 213412 at Virtual reality can spot navigation problems in early Alzheimer’s disease /research/news/virtual-reality-can-spot-navigation-problems-in-early-alzheimers-disease <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/env1cone1.jpg?itok=x-RzCW7j" alt="" title="Example environment from the virtual reality display , Credit: University of Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The study highlights the potential of new technologies to help diagnose and monitor conditions such as Alzheimer’s disease, which affects more than 525,000 people in the UK. </p> <p>In 2014, Professor John O’Keefe of UCL was jointly awarded the Nobel Prize in Physiology or Medicine for ‘discoveries of cells that constitute a positioning system in the brain’.
Essentially, this means that the brain contains a mental ‘satnav’ of where we are, where we have been, and how to find our way around.</p> <p>A key component of this internal satnav is a region of the brain known as the entorhinal cortex. This is one of the first regions to be damaged in Alzheimer’s disease, which may explain why ‘getting lost’ is one of the first symptoms of the disease. However, the pen-and-paper cognitive tests used in clinic to diagnose the condition are unable to test for navigation difficulties.</p> <p>In collaboration with Professor Neil Burgess at UCL, a team of scientists at the Department of Clinical Neurosciences at the University of Cambridge led by Dr Dennis Chan, previously Professor O’Keefe’s PhD student, developed and trialled a VR navigation test in patients at risk of developing dementia. The results of their study are published today in the journal Brain.</p> <p>In the test, a patient dons a VR headset and undertakes a test of navigation while walking within a simulated environment. Successful completion of the task requires intact functioning of the entorhinal cortex, so Dr Chan’s team hypothesised that patients with early Alzheimer’s disease would be disproportionately affected on the test.</p> <p>The team recruited 45 patients with mild cognitive impairment (MCI) from the Cambridge University Hospitals NHS Trust Mild Cognitive Impairment and Memory Clinics, supported by the Windsor Research Unit at Cambridgeshire and Peterborough NHS Foundation Trust. Patients with MCI typically exhibit memory impairment, but while MCI can indicate early Alzheimer’s, it can also be caused by other conditions such as anxiety and even normal ageing. As such, establishing the cause of MCI is crucial for determining whether affected individuals are at risk of developing dementia in the future.  </p> <p>The researchers took samples of cerebrospinal fluid (CSF) to look for biomarkers of underlying Alzheimer’s disease in their MCI patients, with 12 testing positive.
The researchers also recruited 41 age-matched healthy controls for comparison.</p> <p>All of the patients with MCI performed worse on the navigation task than the healthy controls. However, the study yielded two crucial additional observations. First, MCI patients with positive CSF markers – indicating the presence of Alzheimer’s disease, thus placing them at risk of developing dementia – performed worse than those with negative CSF markers at low risk of future dementia.</p> <p>Second, the VR navigation task was better at differentiating between these low- and high-risk MCI patients than a battery of currently used tests considered to be the gold standard for the diagnosis of early Alzheimer’s.</p> <p>“These results suggest a VR test of navigation may be better at identifying early Alzheimer’s disease than tests we use at present in clinic and in research studies,” says Dr Chan.</p> <p>VR could also help clinical trials of future drugs aimed at slowing down, or even halting, progression of Alzheimer’s disease. Currently, the first stage of drug trials involves testing in animals, typically mouse models of the disease. To determine whether treatments are effective, scientists study their effect on navigation using tests such as a water maze, where mice have to learn the location of hidden platforms beneath the surface of opaque pools of water. If new drugs are found to improve memory on this task, they proceed to trials in human subjects, but using word and picture memory tests. This lack of comparability of memory tests between animal models and human participants represents a major problem for current clinical trials.</p> <p>“The brain cells underpinning navigation are similar in rodents and humans, so testing navigation may allow us to overcome this roadblock in Alzheimer’s drug trials and help translate basic science discoveries into clinical use,” says Dr Chan.
“We’ve wanted to do this for years, but it’s only now that VR technology has evolved to the point that we can readily undertake this research in patients.”</p> <p>In fact, Dr Chan believes technology could play a crucial role in diagnosing and monitoring Alzheimer’s disease. He is working with Professor Cecilia Mascolo at Cambridge’s Centre for Mobile, Wearable Systems and Augmented Intelligence to develop apps for detecting the disease and monitoring its progression. These apps would run on smartphones and smartwatches. As well as looking for changes in how we navigate, the apps will track changes in other everyday activities such as sleep and communication.</p> <p>“We know that Alzheimer’s affects the brain long before symptoms become apparent,” says Dr Chan. “We’re getting to the point where everyday tech can be used to spot the warning signs of the disease well before we become aware of them.</p> <p>“We live in a world where mobile devices are almost ubiquitous, and so app-based approaches have the potential to diagnose Alzheimer’s disease at minimal extra cost and at a scale way beyond that of brain scanning and other current diagnostic approaches.”</p> <p>The VR research was funded by the Medical Research Council and the Cambridge NIHR Biomedical Research Centre. The app-based research is funded by Wellcome, the European Research Council and the Alan Turing Institute.</p> <p><em><strong>Reference</strong><br /> Howett, D, Castegnaro, A, et al.
<a href="https://academic.oup.com/brain/article-lookup/doi/10.1093/brain/awz116">Differentiation of mild cognitive impairment using an entorhinal cortex based test of VR navigation.</a> Brain; 28 May 2019; DOI: 10.1093/brain/awz116</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Virtual reality (VR) can identify early Alzheimer’s disease more accurately than ‘gold standard’ cognitive tests currently in use, suggests new research from the University of Cambridge.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We’ve wanted to do this for years, but it’s only now that virtual reality technology has evolved to the point that we can readily undertake this research in patients</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Dennis Chan</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">University of Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Example environment from the virtual reality display </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative
Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 23 May 2019 23:19:36 +0000 cjb250 205502 at Cambridge researchers win European Research Council funding /research/news/cambridge-researchers-win-european-research-council-funding <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/crop_110.jpg?itok=mGB6Ivl7" alt="Left to right: Christopher Reynolds, Cecilia Mascolo, Alfonso Martinez Arias" title="Left to right: Christopher Reynolds, Cecilia Mascolo, Alfonso Martinez Arias, Credit: University of Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Two hundred and twenty-two senior scientists from across Europe were awarded grants in today’s announcement, representing a total of €540 million in research funding.
The UK has 47 grantees in this year’s funding round, the most of any ERC participating country.</p> <p>ERC grants are awarded through open competition to projects headed by starting and established researchers, irrespective of their origins, who are working or moving to work in Europe. The sole criterion for selection is scientific excellence.</p> <p>ERC Advanced Grants are designed to support excellent scientists in any field with a recognised track record of research achievements in the last ten years.</p> <p><strong>Professor Clare Grey</strong> from the Department of Chemistry, and a Fellow of Pembroke College, leads a project focused on the development of longer lasting, higher energy density and cheaper rechargeable batteries, one of society’s major technological challenges. Batteries are currently the limiting components in the shift from gasoline-powered to electric vehicles.</p> <p>Using a variety of experimental techniques, including dynamic nuclear polarisation NMR spectroscopy, Grey and her team will explore a range of different battery chemistries, including more traditional lithium-ion and newer solid state and redox-flow batteries, with a particular focus on understanding the interfaces and interphases that form in these systems. The interdisciplinary project combines analytical and physical chemistry, materials characterisation, electrochemistry and electronic structures of materials, interfaces and nanoparticles. The final result will be a significantly improved understanding of the structures of new types of batteries and how they evolve during the charge-discharge cycle, coupled with strategies for designing improved battery structures.</p> <p><strong>Professor Cecilia Mascolo</strong> from the Department of Computer Science and Technology, and a Fellow of Jesus College, will lead a project focused on the use of mobile devices for medical diagnostics. 
Mascolo and her team will study how the microphones in mobile and wearable devices may be used to diagnose and monitor various health conditions, since sounds from the human body can be indicators of disease or its onset.</p> <p>While audio sensing in a mobile context is inexpensive to deploy and can reach people who may not have access to or be able to afford other diagnostic tests, it does come with challenges which threaten its use in a clinical context: namely its power-hungry nature and the sensitivity of the data it collects. Mascolo’s ERC funding will support the development of a systematic framework to link sounds to disease diagnosis while addressing power consumption and privacy concerns by maximising the use of local hardware resources, optimising for both power consumption and accuracy.</p> <p><strong>Professor Christopher Reynolds</strong> from the Institute of Astronomy, and a Fellow of Sidney Sussex College, leads a project focused on the feedback from supermassive black holes at the centres of galaxies. These supermassive black holes have a profound influence on the evolution of galaxies and galaxy groups/clusters, but fundamental questions remain.</p> <p>To help address these questions, Reynolds and his team are studying the highly luminous central regions of galaxies around the black hole, known as active galactic nuclei (AGN). Reynolds’ ERC funding will support a set of projects to explore the multi-scale physics of AGN feedback. A new theoretical understanding of AGN feedback as a function of mass, environment, and cosmic time will be essential for interpreting the torrent of data from current and future observatories, and understanding some of the most powerful phenomena in the universe.</p> <p><strong>Professor Alfonso Martinez Arias</strong> from the Department of Genetics will lead a project focused on understanding the early stages of mammalian embryogenesis. 
The development of an embryo requires the spatially structured emergence of tissues and organs, a process which relies on the early establishment of a coordinate system that acts as a template for the organism. Exactly how this process occurs is an open question and one which is difficult to investigate experimentally, particularly in mammals.</p> <p>Using gastruloids, a stem cell-based experimental system they have developed, Martinez Arias and his team will probe the functional relationships between the mechanical activities of multicellular ensembles and the dynamics that control the organisation and shape of the mammalian body plan: the arrangement of tissues and organs with reference to a global coordinate system.</p> <p>Finally, <strong>Professor Austin Smith</strong> from the Wellcome-MRC Cambridge Stem Cell Institute and the Department of Biochemistry will lead a project on the plasticity of the pluripotent stem cell network. Pluripotent stem cells have the potential to become any of the cells and tissues in the body, but the evolutionary origins of this phenomenon are unclear.</p> <p>Using a cross-disciplinary approach, Smith and his team hope to uncover the core biological programme moulded by evolution into different forms. The team are investigating the molecular logic governing early development, lineage plasticity, pluripotent identity and stem cell self-renewal.</p> <p>The President of the European Research Council (ERC), Professor Jean-Pierre Bourguignon, said: “Since 2007, the European Research Council has attracted and financed some of the most audacious research proposals, and independent evaluations show that this approach has paid off. 
With this call, another 222 researchers from all over Europe and beyond will pursue their best ideas and are in an excellent position to trigger breakthroughs and major scientific advances.”</p> <p>Carlos Moedas, European Commissioner for Research, Science and Innovation, said: “The ERC Advanced Grants back outstanding researchers throughout Europe. Their pioneering work has the potential to make a difference in people’s everyday life and deliver solutions to some of our most urgent challenges. The ERC gives these bright minds the possibility to follow their most creative ideas and to play a decisive role in the advancement of all domains of knowledge.”</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Five researchers at the University of Cambridge have won advanced grants from the European Research Council (ERC), Europe’s premier research funding body.</p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">The ERC gives these bright minds the possibility to follow their most creative ideas and to play a decisive role in the advancement of all domains of knowledge</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Carlos Moedas</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">University of Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Left to right: Christopher Reynolds, Cecilia Mascolo, Alfonso Martinez Arias</div></div></div> Thu, 28 Mar 2019 11:00:00 +0000 Young at heart /stories/young-at-heart <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Improvements in public health, education and medicine mean that our lives are much longer than at any point in human history. 
Thanks to studies of volunteers from the eastern region, we may be able to spend these extra years living independently and in good health.</p> </p></div></div></div> Wed, 13 Mar 2019 14:03:28 +0000 Cambridge and Nokia Bell Labs establish new research centre to advance AI-supported multi-sensory personal devices /news/cambridge-and-nokia-bell-labs-establish-new-research-centre-to-advance-ai-supported-multi-sensory <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/news/computercodebymarkusspiske580.jpg?itok=e_ZkqQFQ" alt="Computer code. Image by Markus Spiske" title="Computer code, Credit: Markus Spiske" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Nokia Bell Labs is a founding partner of the new Centre for Mobile, Wearable Systems and Augmented Intelligence, to be based in Cambridge’s world-leading Department of Computer Science and Technology. The Centre will advance state-of-the-art mobile systems, security, new materials, and artificial intelligence (AI) to address one of the main human needs – the ability to communicate better with each other.</p> <p>The collaboration pairs two innovation powerhouses. Nokia Bell Labs in Cambridge conducts research on novel sensors based on emerging materials, embedded and network intelligence, and computational social science. The University’s Department of Computer Science and Technology has expertise in analysing mobile data and in systems research with real-world applications and quantifiable impact.</p> <h2>Research into mobile, wearable and augmented intelligence systems</h2> <p>The research jointly conducted in the new Centre will redefine the way people interact with the digital and physical world. 
Areas of focus include precise, predictive and personalised medicine, digital, physical, mental, and social well-being, and sensory human communication experiences beyond visual and audio. The Centre will be directed by Cecilia Mascolo, Professor of Mobile Systems, and Alastair Beresford, Reader in Computer Security.</p> <p>“The new Centre provides support for high-quality, long-term research into mobile, wearable and augmented intelligence systems in Cambridge,” said Professor Mascolo. “In addition, the Centre will also engage with other researchers across the UK and abroad. We will formally launch the new Centre with a research symposium later in the year, with researchers drawn from across the UK and beyond.”</p> <p>“Mobile systems have transformed our lives and evolved dramatically over the last 20 years,” said Dr Beresford. “However, there are many big changes to come, and our research will ensure we have the right technical solutions as well as appropriate safeguards available.”</p> <h2>Establishing a dynamic research community</h2> <p>The Centre will be used to establish a vibrant research community, and support Nokia Bell Labs PhD Studentships as well as postdoctoral researchers over the next five years. It will also support the wider research community with a range of events, workshops and seminars. The official opening and first academic research symposium will take place in September.</p> <p>Markus Hofmann, Head of Applications, Platforms and Software Systems Research at Nokia Bell Labs, said: “We are very excited to participate in the creation of this new Centre at Cambridge. 
We look forward to solving the key technical challenges as we move towards our shared goal – to provide people with enhanced awareness of their world, to help them better sense and interpret their digital and physical environment, to enable the long-distance exchange of people’s emotions and perceptions, to augment and improve the human experience in a digitally connected world.”</p> <p>Julie Byrne, Head of External Collaboration Programs at Nokia Bell Labs, said: “We established our Distinguished Academic Partner Program to bring together the best and brightest minds to solve human need challenges by delivering disruptive innovations. We are delighted to be a founding partner of this new Centre and to bring the world-leading Department of Computer Science and Technology at Cambridge to our collaboration program.”</p> <h2>More about the impact of philanthropic giving</h2> <p><a href="https://www.philanthropy.cam.ac.uk/impact-of-giving/impact-stories" title="Link: Philanthropic impact stories">Read other examples of the positive impact of philanthropy at Cambridge</a></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The long-sought dream of wearable and mobile devices that will interpret, replicate and influence people’s emotions and perceptions will soon be a reality thanks to a collaboration between the University and Nokia Bell Labs.</p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Markus Spiske</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Computer code</div></div></div> Wed, 20 Jun 2018 14:00:23 +0000