University of Cambridge - Department of Radiology /taxonomy/affiliations/department-of-radiology News from the Department of Radiology. en AI cuts waiting times for cancer patients in NHS first /research/news/ai-cuts-waiting-times-for-cancer-patients-in-nhs-first <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/2-rajeshjena-radiotherapy.jpg?itok=s0CsD1yL" alt="Dr Raj Jena in front of a computer" title="Dr Raj Jena, Credit: CUH" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>'OSAIRIS' is saving many hours of doctors’ time in preparing scans and helping to cut the time patients have to wait between referral for radiotherapy and starting treatment.</p>&#13; &#13; <p>Working alongside this AI technology, specialists can plan radiotherapy treatments approximately two and a half times faster than if they were working alone, ensuring more patients can get treatment sooner and improving the likelihood of cure.</p>&#13; &#13; <p>The technology is currently being used at Addenbrooke’s for prostate and head and neck cancers, but has the potential to work for many other types of cancer, benefitting patients across the NHS. </p>&#13; &#13; <p>Dr Raj Jena, oncologist at Cambridge University Hospitals NHS Foundation Trust, led the research for the NHS and the University of Cambridge. His research includes long-term collaborations with Microsoft Research on an AI research project known as Project InnerEye to develop machine learning techniques to support the global medical imaging community.
To broaden access to research in this field, Microsoft Research made available Project InnerEye toolkits as open-source software.</p>&#13; &#13; <p>With a £500,000 grant from the NHS AI Lab, Dr Jena’s team created a new AI tool, OSAIRIS, using open-source software from Project InnerEye and data from patients who had previously been treated in the hospital and agreed to contribute to the research.</p>&#13; &#13; <p>OSAIRIS works by significantly cutting the amount of time a doctor needs to spend drawing around healthy organs on scans before radiotherapy.  Outlining the organs, known as ‘segmentation’, is critical in order to protect the healthy tissue around the cancer from radiation. It can take a doctor between 20 minutes and three hours to perform this task, per patient. This complex but routine task is ideally suited to AI with the oncologist in control, checking every scan after OSAIRIS has done the segmentation.   </p>&#13; &#13; <p>Rigorous tests and risk assessments have been carried out to ensure OSAIRIS is safe and can be used in the day-to-day care of radiotherapy patients across the NHS. In masked tests, known as ‘Turing tests’, doctors were unable to tell the difference between the work of OSAIRIS and the work of a doctor colleague.   </p>&#13; &#13; <p>Dr Raj Jena said: “OSAIRIS does much of the work in the background so that when the oncologist sits down to start planning treatment, most of the heavy lifting is done.  It is the first cloud-based AI technology to be developed and deployed within the NHS. 18 months of rigorous testing will enable us to share this technology safely across the NHS for patient benefit.” </p>&#13; &#13; <p>“We’ve already started to work on a model that works in the chest, so that will work for lung cancer and breast cancer particularly,” he explains. 
“And also, from my perspective as a neuro-oncologist, I’m interested that we’re building the brain model as well so that we’ve got something that works for brain tumours as well.”</p>&#13; &#13; <p> Aditya Nori, General Manager of Healthcare for Microsoft Research, said: “By combining the power of AI with the world-class clinical expertise of the NHS, we have an amazing opportunity for revolutionising healthcare together, while preserving the human element that is the essence of high-quality and safe care.”</p>&#13; &#13; <p>“Healthcare offers the possibility not only to have technical impact but also societal impact, so I am really thrilled about this. The fact that we have AI finally in the NHS will also open the doors for other kinds of AI technologies to really reduce the burden that’s placed on clinicians, and more importantly, improve patient safety, outcomes, and experiences.”</p>&#13; &#13; <p>Health and Social Care Secretary Steve Barclay said: “Cutting-edge technology can help us reduce waiting times for cancer patients, free up time for staff so they can focus on patient care, and ultimately save lives – and artificial intelligence is playing an increasingly important role.</p>&#13; &#13; <p>“Backed by £500,000 in government funding, the team at Addenbrooke’s Hospital are utilising the innovative OSAIRIS tool to speed up radiotherapy scans at more than twice the normal rate – reducing the time it takes to start potentially life-saving treatment.</p>&#13; &#13; <p>“It will also help ease the pressure on the NHS and cut waiting lists, one of the government’s five priorities.</p>&#13; &#13; <p>“In the year the NHS turns 75 we are investing in its future and last week announced a new £21 million fund for Trusts to deploy AI tools in a safe and controlled way to speed up the diagnosis and treatment for a range of conditions.”</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div 
class="field-items"><div class="field-item even"><p><p>Artificial intelligence developed by and for the NHS at Addenbrooke’s Hospital in Cambridge is reducing the amount of time cancer patients wait for radiotherapy treatment. </p>&#13; </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.cuh.nhs.uk/" target="_blank">CUH</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Dr Raj Jena</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 27 Jun 2023 08:00:18 +0000 Anonymous 240251 at World first for AI and machine learning to treat COVID-19 patients worldwide /research/news/world-first-for-ai-and-machine-learning-to-treat-covid-19-patients-worldwide <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/3408162.jpg?itok=Os3vf-6V" alt="A clinician helping a COVID-19 patient with an oxygen mask in a hospital in Iran" title="A clinician helping a COVID-19 patient with an oxygen mask in a hospital in Iran, Credit: Mehrnews.com" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The research was sparked by the pandemic and set out to build an AI tool to predict how much extra oxygen a COVID-19 patient may need in the first days of hospital care, using data from across four continents. </p> <p>The technique, known as federated learning, used an algorithm to analyse chest x-rays and electronic health data from hospital patients with COVID-19 symptoms. </p> <p>To maintain strict patient confidentiality, the patient data was fully anonymised and an algorithm was sent to each hospital so no data was shared or left its location.
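The federated set-up described here (a shared model travels to each hospital, trains on local data, and only the learned parameters return for pooling) can be sketched in a deliberately simplified form. This is an illustrative federated-averaging toy on a linear model with synthetic data; it is not the EXAM code, and every name and number in it is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=50):
    """One 'hospital' trains on its own data; only weights leave the site."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

# Three sites holding private data drawn from the same underlying relationship
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    sites.append((X, y))

# Federated averaging: broadcast the model, train locally, pool only parameters.
# The raw (X, y) arrays never leave their site.
w_global = np.zeros(2)
for _ in range(20):
    local_weights = [local_update(w_global, X, y) for X, y in sites]
    w_global = np.mean(local_weights, axis=0)

print(np.round(w_global, 2))  # close to the underlying [2.0, -1.0]
```

The point of the sketch is the data flow: only `local_weights` cross the site boundary, yet the averaged model ends up close to what training on the pooled data would give.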
</p> <p>Once the algorithm had ‘learned’ from the data, the analysis was brought together to build an AI tool which could predict the oxygen needs of hospital COVID-19 patients anywhere in the world.</p> <p>Published today in <em>Nature Medicine</em>, the study dubbed EXAM (for EMR CXR AI Model), is one of the largest, most diverse clinical federated learning studies to date. </p> <p>To check the accuracy of EXAM, it was tested out in a number of hospitals across five continents, including Addenbrooke’s Hospital. The results showed it predicted the oxygen needed within 24 hours of a patient’s arrival in the emergency department, with a sensitivity of 95 per cent and a specificity of over 88 per cent. </p> <p>“Federated learning has transformative power to bring AI innovation to the clinical workflow,” said Professor Fiona Gilbert, who led the study in Cambridge and is Honorary Consultant Radiologist at Addenbrooke’s Hospital and Chair of Radiology at the University of Cambridge School of Clinical Medicine. </p> <p>“Our continued work with EXAM demonstrates that these kinds of global collaborations are repeatable and more efficient, so that we can meet clinicians’ needs to tackle complex health challenges and future epidemics.”</p> <p>First author on the study, Dr Ittai Dayan, from Mass General Brigham in the US, where the EXAM algorithm was developed, said:</p> <p>“Usually in AI development, when you create an algorithm on one hospital’s data, it doesn’t work well at any other hospital.
By developing the EXAM model using federated learning and objective, multimodal data from different continents, we were able to build a generalizable model that can help frontline physicians worldwide.”</p> <p>Bringing together collaborators across North and South America, Europe and Asia, the EXAM study took just two weeks of AI ‘learning’ to achieve high-quality predictions.</p> <p>“Federated Learning allowed researchers to collaborate and set a new standard for what we can do globally, using the power of AI,” said Dr Mona G Flores, Global Head for Medical AI at NVIDIA. “This will advance AI not just for healthcare but across all industries looking to build robust models without sacrificing privacy.”</p> <p>The outcomes of around 10,000 COVID-19 patients from across the world were analysed in the study, including 250 who came to Addenbrooke’s Hospital in the first wave of the pandemic in March/April 2020. </p> <p>The research was supported by the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre (BRC). </p> <p>Work on the EXAM model has continued. Mass General Brigham and the NIHR Cambridge BRC are working with NVIDIA Inception startup Rhino Health, cofounded by Dr Dayan, to run prospective studies using EXAM. </p> <p>Professor Gilbert added: “Creating software to match the performance of our best radiologists is complex, but a truly transformative aspiration. The more we can securely integrate data from different sources using federated learning and collaboration, and have the space needed to innovate, the faster academics can make those transformative goals a reality.”</p> <p><em><strong>Reference</strong><br /> Dayan, I et al.
<a href="https://www.nature.com/articles/s41591-021-01506-3">Federated learning for predicting clinical outcomes in patients with COVID-19.</a> Nat Med; 15 Sept 2021; DOI: 10.1038/s41591-021-01506-3</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Addenbrooke’s Hospital in Cambridge along with 20 other hospitals from across the world and healthcare technology leader, NVIDIA, have used artificial intelligence (AI) to predict COVID-19 patients’ oxygen needs on a global scale.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Creating software to match the performance of our best radiologists is complex, but a truly transformative aspiration</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Fiona Gilbert</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://commons.wikimedia.org/wiki/File:3408162_بخش_ویژه_کرونا_در_بیمارستان_هاجر.jpg" target="_blank">Mehrnews.com</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">A clinician helping a COVID-19 patient with an oxygen mask in a hospital in Iran</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is 
licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Wed, 15 Sep 2021 16:19:22 +0000 cjb250 226721 at Improved MRI scans could aid in development of arthritis treatments /research/news/improved-mri-scans-could-aid-in-development-of-arthritis-treatments <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/crop3.png?itok=lRDv6XiY" alt="3D model of a knee with osteoarthritis" title="3D model of a knee with osteoarthritis, Credit: James MacKay" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A team of engineers, radiologists and physicians, led by the University of Cambridge, developed the algorithm, which builds a three-dimensional model of an 
individual’s knee joint in order to map where arthritis is affecting the knee. It then automatically creates ‘change maps’ which not only tell researchers whether there have been significant changes during the study but allow them to locate exactly where these are.</p> <p>There are few effective treatments for arthritis, and the technique could be a considerable boost to efforts to develop and monitor new therapies for the condition. The <a href="https://onlinelibrary.wiley.com/doi/full/10.1002/jmri.27193">results</a> are reported in the <em>Journal of Magnetic Resonance Imaging</em>.</p> <p>Osteoarthritis is the most common form of arthritis in the UK. It develops when the articular cartilage, which coats the ends of bones and allows them to glide smoothly over each other at joints, is worn down, resulting in painful, immobile joints. Currently, there is no recognised cure and the only definitive treatment is surgery for artificial joint replacement.</p> <p>Osteoarthritis is normally identified on an X-ray by a narrowing of the space between the bones of the joint due to a loss of cartilage. However, X-rays do not have enough sensitivity to detect subtle changes in the joint over time.</p> <p>“We don’t have a good way of detecting these tiny changes in the joint over time in order to see if treatments are having any effect,” said Dr James MacKay from Cambridge’s Department of Radiology, and the study’s lead author. “In addition, if we’re able to detect the early signs of cartilage breakdown in joints, it will help us understand the disease better, which could lead to new treatments for this painful condition.”</p> <p>The current study builds on earlier work from the same team, who developed an algorithm to monitor subtle changes in arthritic joints in CT scans. 
Now, they are using similar techniques for MRI, which provides more complete information about the composition of tissue – not just information about the thickness of cartilage or bone.</p> <p>MRI is already widely used to diagnose joint problems, including arthritis, but manually labelling each image is time-consuming, and may be less accurate than automated or semi-automated techniques when detecting small changes over a period of months or years.</p> <p>“Thanks to the engineering expertise of our team, we now have a better way of looking at the joint,” said MacKay.</p> <p>The technique MacKay and his colleagues from Cambridge’s Department of Engineering developed, called 3D cartilage surface mapping (3D-CaSM), was able to pick up changes over a period of six months that weren’t detected using standard X-ray or MRI techniques.</p> <p>The researchers tested their algorithm on knee joints from bodies that had been donated for medical research, and in a further study with human participants between 40 and 60 years old. All of the participants suffered from knee pain, but were considered too young for a knee replacement. Their joints were then compared with those of people of a similar age with no joint pain.</p> <p>“There’s a certain degree of deterioration of the joint that happens as a normal part of aging, but we wanted to make sure that the changes we were detecting were caused by arthritis,” said MacKay. “The increased sensitivity that 3D-CaSM provides allows us to make this distinction, which we hope will make it a valuable tool for testing the effectiveness of new therapies.”</p> <p>The software is <a href="https://mi.eng.cam.ac.uk/Main/StradView">freely available</a> to download and can be added to existing systems. MacKay says that the algorithm can easily be added to existing workflows and that the training process for radiologists is short and straightforward. 
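The core of the ‘change map’ idea, comparing a quantity at every point on a registered joint surface between two timepoints and flagging where the change exceeds measurement noise, can be illustrated with a toy calculation. The thickness values and the 0.2 mm threshold below are invented for illustration; the real 3D-CaSM pipeline also performs surface registration and statistical modelling that this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-vertex cartilage thickness (mm) on a registered joint surface,
# measured at baseline and at 6-month follow-up. Values are synthetic.
n_vertices = 500
baseline = rng.uniform(1.5, 3.0, n_vertices)
followup = baseline + rng.normal(scale=0.05, size=n_vertices)  # scan noise
followup[:50] -= 0.4  # a focal region of genuine cartilage loss

# Change map: signed thickness difference at every surface point
change_map = followup - baseline

# Flag vertices whose change exceeds a smallest-detectable-difference
# threshold (0.2 mm here, a made-up figure standing in for the repeatability
# limit a real study would estimate from test-retest scans)
significant = np.abs(change_map) > 0.2

print(int(significant.sum()))  # roughly the 50 vertices in the thinned region
```

Because the threshold sits well above the simulated noise, the flagged vertices localise the thinned region rather than scattering across the whole surface, which is the point of a change map over a single summary statistic.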
</p> <p>As part of a separate study funded by the European Union, the researchers will also be using the algorithm to test whether it can predict which patients will need a knee replacement, by detecting early signs of arthritis.</p> <p><strong><em>Reference:</em></strong><br /> <em>James W. MacKay et al. ‘<a href="https://onlinelibrary.wiley.com/doi/full/10.1002/jmri.27193">Three-dimensional Surface-based Analysis of Cartilage MRI data in Knee Osteoarthritis: Validation and Initial Clinical Application</a>.’ Journal of Magnetic Resonance Imaging (2020). DOI: 10.1002/jmri.27193</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>An algorithm that analyses MRI images and automatically detects small changes in knee joints over time could be used in the development of new treatments for arthritis.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Thanks to the engineering expertise of our team, we now have a better way of looking at the joint</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">James MacKay</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">James MacKay</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">3D model of a knee with osteoarthritis</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a 
href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 09 Jun 2020 19:00:00 +0000 sc604 215312 at Harnessing AI in the fight against COVID-19 /research/news/harnessing-ai-in-the-fight-against-covid-19 <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/covid-19-ct-scan.jpg?itok=tu3tE1kd" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>An open-source <a href="https://covid19ai.maths.cam.ac.uk/">artificial intelligence (AI) tool</a>, combining chest imaging data with laboratory and clinical data, is being developed by Cambridge researchers to support the rapid diagnosis and triaging of patients with COVID-19 in the UK.</p> <p>The team, led by Professors Carola-Bibiane 
Schönlieb and Evis Sala, brings together expertise in AI for imaging with expertise in radiology and clinical applications from Addenbrooke’s and Papworth Hospitals, as well as collaborators from the UK, China, Austria and Italy, to develop a prediction model that can rapidly and reliably diagnose and suggest a prognosis to doctors.</p> <p>Reverse-transcription polymerase chain reaction (RT-PCR) tests are currently the most common tool used to diagnose COVID-19, but they are only up to 70% sensitive, meaning that up to 30% of infected patients receive a false-negative result.</p> <p>While chest X-rays and CT scans provide valuable diagnostic and monitoring information that can complement laboratory and clinical data, interpreting them is a complex task typically done by radiologists, whose expertise is often in high demand. Fast and accurate diagnosis of patients in order to limit disease spread, together with the rapid determination of whether a patient is likely to recover, require intensive care unit (ICU) admission, or need intensive ventilation, is key to allocating resources and to improving patient outcomes.</p> <p>“AI offers huge potential to support agile clinical decision making, ensuring patients receive the most appropriate support and leading to better patient outcomes,” said Sala, who is based at the Department of Radiology.</p> <p>Recent <a href="https://covid19ai.maths.cam.ac.uk/news-results">studies</a> have suggested that using AI could have a meaningful impact on the management of patients with COVID-19. AI tools such as deep learning can offer automated image analysis and integration with clinical data to help clinicians make more informed decisions for treatment.</p> <p>However, good quality data and computing power are required to train and optimise predictive AI models, and data availability is a major bottleneck when developing new systems. 
Coupled with this, the lack of standardisation of datasets makes it challenging to reuse existing AI tools in a different country from the one they were trained in. Most current AI tools have been developed on small, locally collected datasets, and the data being collected in hospitals all over the world varies in what is collected and how it is processed. A widely applicable tool for COVID-19 hospital support must therefore be open source, so it can be adapted to different environments; built on a serious effort in data sharing, curation, cleaning and standardisation; and developed with the mathematical, statistical and engineering expertise needed to make it robust and translatable.</p> <p>To address these challenges, the team from the Cambridge Centre for Mathematical Imaging in Healthcare (CMIH) is developing a flexible, open-source AI tool that could be used by hospitals worldwide. Drawing on their history of global research collaboration and expertise in data governance, the team is gathering datasets from Austria, China, Italy and the UK for their work. Data scientists and clinicians are working in close collaboration, following standard protocols to identify bias during development. “Rigorous mathematical models play a key role in mitigating bias and improving the efficacy of the prediction model as they follow universal rules with mathematical guarantees,” said Schönlieb.</p> <p>Using deep learning approaches along with mathematical and statistical analysis methods, the new tool will be accompanied by a comprehensive algorithmic strategy that will allow fine-tuning for datasets with different characteristics and implementation in different countries. The team hopes to launch the AIX-COV-NET tool within the next 12 to 18 months. 
The project has recently received funding from the EU-funded Innovative Medicines Initiative and Intel.</p> <p>“Our team’s strength is the close dialogue we have between clinicians and data scientists, and the passion we all bring for advancing AI solutions for COVID-19,” said Schönlieb, from Cambridge's Department of Applied Mathematics and Theoretical Physics.</p> <p>The core project team is comprised of data scientists and clinicians from across Cambridge and is led by Professor Carola-Bibiane Schönlieb, Director of the Centre for Mathematical Imaging in Healthcare, and Professor Evis Sala, Professor of Oncological Imaging, University of Cambridge &amp; Honorary Consultant Radiologist, Addenbrooke’s Hospital, Cambridge University Hospitals NHS Foundation Trust. The team is supported by an AI and image analysis team, drawn from subject experts across Cambridge and around the world, a clinical team comprised of colleagues from hospitals in Cambridge, London and Vienna, and a support team based in the Faculty of Mathematics. Partner institutions include hospitals in Wuhan, China; Milan, Italy; and Madrid, Spain, and universities in Manchester, Vienna and London.</p> <p>The University of Cambridge has an impressive record of achievement in multidisciplinary research and innovation. <a href="https://www.cmih.maths.cam.ac.uk/">The CMIH</a> is a collaboration between mathematics, engineering, physics and biomedical scientists and clinicians and is one of five centres to receive investment from the Engineering and Physical Sciences Research Council (EPSRC). 
A key aim of this partnership is the delivery of high-quality, multidisciplinary research that will help overcome some of the major challenges facing the NHS.</p> <p> </p> <h2>How you can support Cambridge's COVID-19 research effort</h2> <p><a href="https://www.philanthropy.cam.ac.uk/give-to-cambridge/cambridge-covid-19-research-fund" title="Link: Make a gift to support COVID-19 research at the University">Donate to support COVID-19 research at Cambridge</a></p> <p> </p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>AI-assisted COVID-19 diagnostic and prognostic tool could improve resource allocation and patient outcomes.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">AI offers huge potential to support agile clinical decision making, ensuring patients receive the most appropriate support and leading to better patient outcomes</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Evis Sala</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 04 Jun 2020 01:00:00 +0000 sc604 215102 at Magnetised molecules used to monitor breast cancer /research/news/magnetised-molecules-used-to-monitor-breast-cancer <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/exampleleftprotoncentrepyruvaterightlactatenotext.jpg?itok=PmUlZxgi" alt="Left: Anatomic MR image of breast tumour; Right: Overlays of hyperpolarised 13C-MRI on anatomic images showing pyruvate and lactate in breast cancer" title="Left: Anatomic MR image of breast tumour; Right: Overlays of hyperpolarised 13C-MRI on anatomic images showing pyruvate and lactate in breast cancer, Credit: Kevin Brindle" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>This is the first time researchers have demonstrated that this scanning technique, called carbon-13 hyperpolarised imaging, can be used to monitor breast cancer.</p>&#13; &#13; <p>The team based at the Cancer Research UK Cambridge Institute and the Department of Radiology, University of Cambridge, tested the technique in seven patients with various types and grades of breast cancer before they had received any treatment.</p>&#13; &#13; <p>They used the scan to measure 
how fast the patients’ tumours were metabolising a naturally occurring molecule called pyruvate, and were able to detect differences in the size, type and grade of tumours – a measure of how fast-growing, or aggressive, the cancer is.</p>&#13; &#13; <p>The scan also revealed in more detail the ‘topography’ of the tumour, detecting variations in metabolism between different regions of the same tumour.</p>&#13; &#13; <p>Professor Kevin Brindle, lead researcher from the institute, said: “This is one of the most detailed pictures of the metabolism of a patient’s breast cancer that we’ve ever been able to achieve. It’s like we can see the tumour ‘breathing’.</p>&#13; &#13; <p>“Combining this with advances in genetic testing, this scan could in the future allow doctors to better tailor treatments to each individual, and detect whether patients are responding to treatments, like chemotherapy, earlier than is currently possible.”</p>&#13; &#13; <p>Hyperpolarised carbon-13 pyruvate is an isotope-labelled form of the molecule that is slightly heavier than the naturally occurring pyruvate, which is formed in our bodies from the breakdown of glucose and other sugars.</p>&#13; &#13; <p>In the study, the scientists ‘hyperpolarised’, or magnetised, carbon-13 pyruvate by cooling it to about one degree above absolute zero (-272°C) and exposing it to extremely strong magnetic fields and microwave radiation. The frozen material was then thawed and dissolved into an injectable solution.</p>&#13; &#13; <p>Patients were injected with the solution and then received an MRI scan at Addenbrooke’s Hospital. 
Magnetising the carbon-13 pyruvate molecules increases the signal strength by 10,000 times so that they are visible on the scan.</p>&#13; &#13; <p>The researchers used the scan to measure how fast pyruvate was being converted into a substance called lactate.</p>&#13; &#13; <p>Our cells convert pyruvate into lactate as part of the metabolic processes that produce energy and the building blocks for making new cells. Tumours have a different metabolism to healthy cells, and so produce lactate more quickly. This rate also varies between tumours, and between different regions of the same tumour.</p>&#13; &#13; <p>The researchers showed that monitoring this conversion in real time could be used to infer the type and aggressiveness of the breast cancer.</p>&#13; &#13; <p>The team now hopes to trial this scan in larger groups of patients, to see if it can be reliably used to inform treatment decisions in hospitals.</p>&#13; &#13; <p>Breast cancer is the most common type of cancer in the UK, with around 55,000 new cases each year. 80% of people with breast cancer survive for 10 years or more; however, for some subtypes, survival is much lower.</p>&#13; &#13; <p>Professor Charles Swanton, Cancer Research UK’s chief clinician, said: “This exciting advance in scanning technology could provide new information about the metabolic status of each patient’s tumour upon diagnosis, which could help doctors to identify the best course of treatment.</p>&#13; &#13; <p>“And the simple, non-invasive scan could be repeated periodically during treatment, providing an indication of whether the treatment is working. 
Ultimately, the hope is that scans like this could help doctors decide to switch to a more intensive treatment if needed, or even reduce the treatment dose, sparing people unnecessary side effects.”</p>&#13; &#13; <p>The research was supported by Cancer Research UK Cambridge Institute and The Mark Foundation for Cancer Research.</p>&#13; &#13; <p><em><strong>Reference</strong><br />&#13; Gallagher, FA et al. <a href="https://www.pnas.org/doi/10.1073/pnas.1913841117">Imaging breast cancer using hyperpolarized carbon-13 MRI.</a> PNAS; 21 Jan 2020; DOI: 10.1073/pnas.1913841117</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A new type of scan that involves magnetising molecules allows doctors to see in real time which regions of a breast tumour are active, according to research at the University of Cambridge, published in <em>Proceedings of the National Academy of Sciences</em>.</p>&#13; </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">This is one of the most detailed pictures of the metabolism of a patient’s breast cancer that we’ve ever been able to achieve. 
It’s like we can see the tumour ‘breathing’</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Kevin Brindle</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Kevin Brindle</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Left: Anatomic MR image of breast tumour; Right: Overlays of hyperpolarised 13C-MRI on anatomic images showing pyruvate and lactate in breast cancer</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 22 Jan 2020 09:07:12 +0000 Anonymous 210722 at