University of Cambridge - virtual reality /taxonomy/subjects/virtual-reality en Free tech eliminates the fear of public speaking /stories/AI-VR-eliminates-fear-of-public-speaking <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge scientist launches free AI-enabled virtual reality platform that transforms users into skilled and confident public speakers.</p> </p></div></div></div> Fri, 14 Mar 2025 08:00:18 +0000 lw355 248763 at Machine learning gives users ‘superhuman’ ability to open and control tools in virtual reality /research/news/machine-learning-gives-users-superhuman-ability-to-open-and-control-tools-in-virtual-reality <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/screenshot-2023-11-07-163538.jpg?itok=DJaBykvi" alt="Modelling a sailboat in virtual reality." title="Modelling a sailboat in virtual reality, Credit: University of Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, used machine learning to develop ‘HotGestures’ – analogous to the hot keys used in many desktop applications.</p>&#13; &#13; <p>HotGestures give users the ability to build figures and shapes in virtual reality without ever having to interact with a menu, helping them stay focused on a task without breaking their train of thought.</p>&#13; &#13; <p>The idea of being able to open and control tools in virtual reality has been a movie trope for decades, but the researchers say that this is the first time such a ‘superhuman’ ability has been made possible. The <a href="https://ieeexplore.ieee.org/document/10269004">results</a> are reported in the journal <em>IEEE Transactions on Visualization and Computer Graphics</em>.</p>&#13; &#13; <p>Virtual reality (VR) and related applications have been touted as game-changers for years, but outside of gaming, their promise has not fully materialised. “Users gain some qualities when using VR, but very few people want to use it for an extended period of time,” said <a href="https://pokristensson.com/">Professor Per Ola Kristensson</a> from Cambridge’s Department of Engineering, who led the research. “Beyond the visual fatigue and ergonomic issues, VR isn’t really offering anything you can’t get in the real world.”</p>&#13; &#13; <p>Most users of desktop software will be familiar with the concept of hot keys – command shortcuts such as ctrl-c to copy and ctrl-v to paste. 
While these shortcuts remove the need to open a menu to find the right tool or command, they rely on the user having the correct command memorised.</p>&#13; &#13; <p>“We wanted to take the concept of hot keys and turn it into something more meaningful for virtual reality – something that wouldn’t rely on the user having a shortcut in their head already,” said Kristensson, who is also co-Director of the <a href="https://www.chia.cam.ac.uk/">Centre for Human-Inspired Artificial Intelligence</a>.</p>&#13; &#13; <p>Instead of hot keys, Kristensson and his colleagues developed ‘HotGestures’, where users perform a gesture with their hand to open and control the tool they need in 3D virtual reality environments.</p>&#13; &#13; <p>For example, performing a cutting motion opens the scissor tool, and the spray motion opens the spray can tool. There is no need for the user to open a menu to find the tool they need, or to remember a specific shortcut. Users can seamlessly switch between different tools by performing different gestures during a task, without having to pause their work to browse a menu or to press a button on a controller or keyboard.</p>&#13; &#13; <p>“We all communicate using our hands in the real world, so it made sense to extend this form of communication to the virtual world,” said Kristensson.</p>&#13; &#13; <p>For the study, the researchers built a neural network gesture recognition system that recognises gestures by running predictions over an incoming stream of hand-joint data. The system was built to recognise ten different gestures associated with building 3D models: pen, cube, cylinder, sphere, palette, spray, cut, scale, duplicate and delete. (A toy sketch of this style of classifier appears at the end of this article.)</p>&#13; &#13; <p>The team carried out two small studies where participants used HotGestures, menu commands or a combination of the two. The gesture-based technique provided fast and effective shortcuts for tool selection and usage. Participants found HotGestures to be distinctive, fast and easy to use, while also complementing conventional menu-based interaction. The researchers designed the system so that there were no false activations – it was able to distinguish correctly between a command and normal hand movement. Overall, the gesture-based system was faster than a menu-based system.</p>&#13; &#13; <p>“There is no VR system currently available that can do this,” said Kristensson. “If using VR is just like using a keyboard and a mouse, then what’s the point of using it? It needs to give you almost superhuman powers that you can’t get elsewhere.”</p>&#13; &#13; <p>The researchers have made the source code and dataset publicly available so that designers of VR applications can incorporate it into their products.</p>&#13; &#13; <p>“We want this to be a standard way of interacting with VR,” said Kristensson. “We’ve had the tired old metaphor of the filing cabinet for decades. We need new ways of interacting with technology, and we think this is a step in that direction. When done right, VR can be like magic.”</p>&#13; &#13; <p>The research was supported in part by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p>&#13; &#13; <p> </p>&#13; &#13; 
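<p><em>The article describes the recogniser only at a high level: a neural network that predicts one of ten tool gestures from a stream of hand-joint positions. As a rough, hypothetical illustration of that general idea – not the published HotGestures model – here is a minimal Python/PyTorch sketch. The joint count, window length, architecture and confidence threshold are all assumptions made for illustration.</em></p>&#13; <pre><code>
# Hypothetical sketch only -- NOT the published HotGestures system.
# Classifies a short window of hand-joint positions into one of the
# ten tool gestures named in the article.
import torch
import torch.nn as nn

GESTURES = ["pen", "cube", "cylinder", "sphere", "palette",
            "spray", "cut", "scale", "duplicate", "delete"]

class GestureClassifier(nn.Module):
    def __init__(self, n_joints=26, hidden=64):  # joint count assumed
        super().__init__()
        # A GRU consumes one flattened (x, y, z) joint vector per frame.
        self.rnn = nn.GRU(input_size=n_joints * 3, hidden_size=hidden,
                          batch_first=True)
        self.head = nn.Linear(hidden, len(GESTURES))

    def forward(self, frames):
        # frames: (batch, time, n_joints * 3)
        _, h = self.rnn(frames)           # final hidden state
        return self.head(h.squeeze(0))    # per-gesture logits

model = GestureClassifier()
window = torch.randn(1, 30, 26 * 3)       # e.g. half a second at 60 fps
probs = torch.softmax(model(window), dim=-1)

# Avoiding false activations, as the article describes: only fire a
# command when the network is confident, otherwise treat the motion
# as ordinary hand movement.
if probs.max() > 0.9:
    print("command:", GESTURES[int(probs.argmax())])
else:
    print("no command")
</code></pre>&#13; <p><em>An untrained network like this one will almost always land in the “no command” branch; the point is only to show the shape of the stream-in, prediction-out loop that the article describes.</em></p>&#13; &#13; 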
<p><em><strong>Reference:</strong><br />&#13; Zhaomou Song, John J Dudley and Per Ola Kristensson. ‘<a href="https://ieeexplore.ieee.org/document/10269004">HotGestures: Complementing Command Selection and Use with Delimiter-Free Gesture-Based Shortcuts in Virtual Reality</a>.’ IEEE Transactions on Visualization and Computer Graphics (2023). DOI: 10.1109/TVCG.2023.3320257</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a virtual reality application where a range of 3D modelling tools can be opened and controlled using just the movement of a user’s hand. </p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"> We need new ways of interacting with technology, and we think this is a step in that direction</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Per Ola Kristensson</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-215161" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/hotgestures-give-users-superhuman-ability-to-open-and-control-tools-in-virtual-reality">HotGestures give users ‘superhuman’ ability to open and control tools in virtual reality</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/3kNFvhU5ntU?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">University of Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Modelling a sailboat in virtual reality </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 08 Nov 2023 07:44:16 +0000 sc604 243101 at Could this monster help you overcome anxiety? /stories/VR-and-anxiety <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>In 2017, Ninja Theory, advised by Cambridge academic Professor Paul Fletcher, took the gaming world by storm with Hellblade, which accurately depicted psychosis. Now the company has teamed up with one of Fletcher’s PhD students to see whether gaming might help improve people’s mental health.</p> </p></div></div></div> Fri, 29 Jul 2022 07:00:08 +0000 cjb250 233501 at New virtual reality software allows scientists to ‘walk’ inside cells /research/news/new-virtual-reality-software-allows-scientists-to-walk-inside-cells <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/crop2_9.jpg?itok=qEbmVYZZ" alt="DBScan analysis being performed on a mature neuron in a typical vLUME workspace." title="DBScan analysis being performed on a mature neuron in a typical vLUME workspace, Credit: Alexandre Kitching" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The software, called <em>v</em>LUME, was created by scientists at the University of Cambridge and 3D image analysis software company Lume VR Ltd. It allows super-resolution microscopy data to be visualised and analysed in virtual reality, and can be used to study everything from individual proteins to entire cells. <a href="https://www.nature.com/articles/s41592-020-0962-1">Details</a> are published in the journal <em>Nature Methods</em>.</p> <p>Super-resolution microscopy, whose pioneers were awarded the 2014 Nobel Prize in Chemistry, makes it possible to obtain images at the nanoscale by using clever tricks of physics to get around the limits imposed by light diffraction. This has allowed researchers to observe molecular processes as they happen. However, a problem has been the lack of ways to visualise and analyse this data in three dimensions.</p> <p>“Biology occurs in 3D, but up until now it has been difficult to interact with the data on a 2D computer screen in an intuitive and immersive way,” said Dr Steven F Lee from Cambridge’s Department of Chemistry, who led the research. “It wasn’t until we started seeing our data in virtual reality that everything clicked into place.”</p> <p>The <em>v</em>LUME project started when Lee and his group met with the Lume VR founders at a public engagement event at the Science Museum in London. 
While Lee’s group had expertise in super-resolution microscopy, the team from Lume specialised in spatial computing and data analysis, and together they were able to develop <em>v</em>LUME into a powerful new tool for exploring complex datasets in virtual reality.</p> <p>“<em>v</em>LUME is revolutionary imaging software that brings humans into the nanoscale,” said Alexandre Kitching, CEO of Lume. “It allows scientists to visualise, question and interact with 3D biological data, in real time, all within a virtual reality environment, to find answers to biological questions faster. It’s a new tool for new discoveries.”</p> <p>Viewing data in this way can stimulate new initiatives and ideas. For example, Anoushka Handa – a PhD student from Lee’s group – used the software to image an immune cell taken from her own blood, and then stood inside her own cell in virtual reality. “It’s incredible – it gives you an entirely different perspective on your work,” she said.</p> <p><img alt="" src="/sites/www.cam.ac.uk/files/inner-images/crop_1_0.jpg" style="width: 590px; height: 285px; float: left;" /></p> <p>The software allows multiple datasets with millions of data points to be loaded in, and finds patterns in the complex data using in-built clustering algorithms. These findings can then be shared with collaborators worldwide using image and video features in the software. (A toy example of this kind of clustering appears at the end of this article.)</p> <p>“Data generated from super-resolution microscopy is extremely complex,” said Kitching. “For scientists, running analysis on this data can be very time-consuming. With <em>v</em>LUME, we have managed to vastly reduce that wait time, allowing for more rapid testing and analysis.”</p> <p>The team is mostly using <em>v</em>LUME with biological datasets, such as neurons, immune cells or cancer cells. For example, Lee’s group has been studying how antigen cells trigger an immune response in the body. “Through segmenting and viewing the data in vLUME, we’ve quickly been able to rule out certain hypotheses and propose new ones,” said Lee. “This software allows researchers to explore, analyse, segment and share their data in new ways. All you need is a VR headset.”</p> 
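<p><em>The article mentions vLUME’s in-built clustering only in passing, and the image caption above shows a DBSCAN analysis. As a rough, hypothetical illustration of that kind of analysis – not vLUME’s actual code or parameters – here is a short Python sketch that clusters a synthetic 3D localisation table with scikit-learn’s DBSCAN.</em></p> <pre><code>
# Hypothetical sketch: DBSCAN over 3D single-molecule localisations.
# The data, eps and min_samples values are made up for illustration
# and are not vLUME defaults.
import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic stand-in for a localisation table: one row per detected
# fluorophore, columns x, y, z in nanometres, three tight clusters.
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal(loc=centre, scale=25.0, size=(500, 3))
    for centre in ([0, 0, 0], [400, 0, 0], [0, 400, 0])
])

# eps is the neighbourhood radius in nm; min_samples sets the local
# density a point needs before it can seed a cluster.
labels = DBSCAN(eps=50.0, min_samples=10).fit_predict(points)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(n_clusters, "clusters;", int((labels == -1).sum()), "noise points")
</code></pre> <p><em>DBSCAN suits localisation data because it needs no preset cluster count and labels sparse background detections as noise (-1); the same logic scales to the millions of points the article mentions, though at that size the choice of eps dominates the result.</em></p> 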
<p><strong><em>Reference:</em></strong><br /> <em>Alexander Spark et al. ‘<a href="https://www.nature.com/articles/s41592-020-0962-1">vLUME: 3D Virtual Reality for Single-molecule Localization Microscopy</a>.’ Nature Methods (2020). DOI: 10.1038/s41592-020-0962-1</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Virtual reality software which allows researchers to ‘walk’ inside and analyse individual cells could be used to understand fundamental problems in biology and develop new treatments for disease.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Biology occurs in 3D, but up until now it has been difficult to interact with the data on a 2D computer screen in an intuitive and immersive way</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Steven Lee</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Alexandre Kitching</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">DBScan analysis being performed on a mature neuron in a typical vLUME workspace.</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 12 Oct 2020 15:00:15 +0000 sc604 218682 at Virtual reality can spot navigation problems in early Alzheimer’s disease /research/news/virtual-reality-can-spot-navigation-problems-in-early-alzheimers-disease <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/env1cone1.jpg?itok=x-RzCW7j" alt="" title="Example environment from the virtual reality display, Credit: University of Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The study highlights the potential of new technologies to help diagnose and monitor conditions such as Alzheimer’s disease, which affects more than 525,000 people in the UK. 
</p> <p>In 2014, Professor John O’Keefe of UCL was jointly awarded the Nobel Prize in Physiology or Medicine for ‘discoveries of cells that constitute a positioning system in the brain’. Essentially, this means that the brain contains a mental ‘satnav’ of where we are, where we have been, and how to find our way around.</p> <p>A key component of this internal satnav is a region of the brain known as the entorhinal cortex. This is one of the first regions to be damaged in Alzheimer’s disease, which may explain why ‘getting lost’ is one of the first symptoms of the disease. However, the pen-and-paper cognitive tests used in clinic to diagnose the condition are unable to test for navigation difficulties.</p> <p>In collaboration with Professor Neil Burgess at UCL, a team of scientists at the Department of Clinical Neurosciences at the University of Cambridge led by Dr Dennis Chan, previously Professor O’Keefe’s PhD student, developed and trialled a VR navigation test in patients at risk of developing dementia. The results of their study are published today in the journal Brain.</p> <p>In the test, a patient dons a VR headset and undertakes a test of navigation while walking within a simulated environment. Successful completion of the task requires intact functioning of the entorhinal cortex, so Dr Chan’s team hypothesised that patients with early Alzheimer’s disease would be disproportionately affected on the test. (A toy illustration of how such a test might be scored appears at the end of this article.)</p> <p>The team recruited 45 patients with mild cognitive impairment (MCI) from the Cambridge University Hospitals NHS Trust Mild Cognitive Impairment and Memory Clinics, supported by the Windsor Research Unit at Cambridgeshire and Peterborough NHS Foundation Trust. Patients with MCI typically exhibit memory impairment, but while MCI can indicate early Alzheimer’s, it can also be caused by other conditions such as anxiety and even normal ageing. As such, establishing the cause of MCI is crucial for determining whether affected individuals are at risk of developing dementia in the future.</p> <p>The researchers took samples of cerebrospinal fluid (CSF) to look for biomarkers of underlying Alzheimer’s disease in their MCI patients, with 12 testing positive. The researchers also recruited 41 age-matched healthy controls for comparison.</p> <p>All of the patients with MCI performed worse on the navigation task than the healthy controls. However, the study yielded two crucial additional observations. First, MCI patients with positive CSF markers – indicating the presence of Alzheimer’s disease, thus placing them at risk of developing dementia – performed worse than those with negative CSF markers at low risk of future dementia.</p> <p>Second, the VR navigation task was better at differentiating between these low- and high-risk MCI patients than a battery of currently used tests considered to be the gold standard for diagnosing early Alzheimer’s.</p> <p>“These results suggest a VR test of navigation may be better at identifying early Alzheimer’s disease than tests we use at present in clinic and in research studies,” says Dr Chan.</p> <p>VR could also help clinical trials of future drugs aimed at slowing down, or even halting, progression of Alzheimer’s disease. Currently, the first stage of drug trials involves testing in animals, typically mouse models of the disease. To determine whether treatments are effective, scientists study their effect on navigation using tests such as a water maze, where mice have to learn the location of hidden platforms beneath the surface of opaque pools of water. 
If new drugs are found to improve memory on this task, they proceed to trials in human subjects, but using word and picture memory tests. This lack of comparability of memory tests between animal models and human participants represents a major problem for current clinical trials.</p> <p>“The brain cells underpinning navigation are similar in rodents and humans, so testing navigation may allow us to overcome this roadblock in Alzheimer’s drug trials and help translate basic science discoveries into clinical use,” says Dr Chan. “We’ve wanted to do this for years, but it’s only now that VR technology has evolved to the point that we can readily undertake this research in patients.”</p> <p>In fact, Dr Chan believes technology could play a crucial role in diagnosing and monitoring Alzheimer’s disease. He is working with Professor Cecilia Mascolo at Cambridge’s Centre for Mobile, Wearable Systems and Augmented Intelligence to develop apps for detecting the disease and monitoring its progression. These apps would run on smartphones and smartwatches. As well as looking for changes in how we navigate, the apps will track changes in other everyday activities such as sleep and communication.</p> <p>“We know that Alzheimer’s affects the brain long before symptoms become apparent,” says Dr Chan. “We’re getting to the point where everyday tech can be used to spot the warning signs of the disease well before we become aware of them.</p> <p>“We live in a world where mobile devices are almost ubiquitous, and so app-based approaches have the potential to diagnose Alzheimer’s disease at minimal extra cost and at a scale way beyond that of brain scanning and other current diagnostic approaches.”</p> <p>The VR research was funded by the Medical Research Council and the Cambridge NIHR Biomedical Research Centre. The app-based research is funded by the Wellcome Trust, the European Research Council and the Alan Turing Institute.</p> 
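<p><em>The article does not spell out how performance on the navigation test is scored. In tasks of the kind used to probe entorhinal function, a common outcome measure is simply the straight-line distance between where the participant ends up and where the target actually is. The short Python sketch below illustrates that generic idea only; the function name, coordinates and metric are assumptions for illustration, not the published study’s protocol.</em></p> <pre><code>
# Generic illustration of a navigation-error score: the Euclidean
# distance between the participant's response location and the true
# target. An assumed metric, not the study's exact measure.
import math

def navigation_error(response_xy, target_xy):
    """Straight-line distance, in metres, between response and target."""
    dx = response_xy[0] - target_xy[0]
    dy = response_xy[1] - target_xy[1]
    return math.hypot(dx, dy)

# One hypothetical trial: target at (2.0, 3.0) m, participant stopped
# at (2.6, 4.1) m, giving an error of about 1.25 m. Larger average
# errors across trials would indicate poorer spatial navigation.
print(round(navigation_error((2.6, 4.1), (2.0, 3.0)), 2))
</code></pre> 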
<p><em><strong>Reference:</strong><br /> Howett, D, Castegnaro, A, et al. ‘<a href="https://academic.oup.com/brain/article-lookup/doi/10.1093/brain/awz116">Differentiation of mild cognitive impairment using an entorhinal cortex-based test of virtual reality navigation</a>.’ Brain; 28 May 2019; DOI: 10.1093/brain/awz116</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Virtual reality (VR) can identify early Alzheimer’s disease more accurately than ‘gold standard’ cognitive tests currently in use, suggests new research from the University of Cambridge.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We’ve wanted to do this for years, but it’s only now that virtual reality technology has evolved to the point that we can readily undertake this research in patients</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Dennis Chan</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">University of Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Example environment from the virtual reality display </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 23 May 2019 23:19:36 +0000 cjb250 205502 at