University of Cambridge - augmented reality /taxonomy/subjects/augmented-reality en New augmented reality head-mounted display offers unrivalled viewing experience /research/news/new-augmented-reality-head-mounted-display-offers-unrivalled-viewing-experience <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/crop1_7.jpg?itok=k0mwcO4Q" alt="Three-dimensional augmented reality image" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The device has an enlarged, scalable eye-box and an increased field of view of 36°, designed for a comfortable viewing experience. It displays images on the retina using pixel beam scanning, which ensures the image stays in focus regardless of the distance at which the user is fixating. <a href="https://spj.science.org/doi/10.34133/2019/9273723?permanently=true">Details</a> are reported in the journal <em>Research</em>.</p> <p>Developed by researchers at the <a href="https://www.cape.eng.cam.ac.uk/">Centre for Advanced Photonics and Electronics (CAPE)</a> in collaboration with the Huawei European Research Centre in Munich, the HMD uses partially reflective beam splitters to form an additional ‘exit pupil’ (a virtual opening through which light travels). 
This, together with narrow pixel beams that travel parallel to each other, and which do not disperse in other directions, produces a high-quality image that remains unaffected by changes in eye focus.</p> <p>The results of a subjective user study conducted with more than 50 participants aged between 16 and 60<sup>1</sup> showed the 3D effect to be ‘very convincing’ for objects from 20 cm to 10 m; the images and videos to be of ‘vivid colour’ and high contrast, with no observable pixels; and, crucially, none of the participants reported any eyestrain or nausea, even after prolonged use over several hours or a full day.</p> <p>The HMD offers high brightness and is suited to a wide range of indoor and outdoor uses. Further research is exploring its potential use in applications such as training, CAD (computer-aided design) development, hospitality, data manipulation, outdoor sport, defence and construction, as well as miniaturising the current head-mounted prototype to a glasses-based format.</p> <p><a href="https://www.eng.cam.ac.uk/profiles/dpc31">Professor Daping Chu</a>, Director of the <a href="https://www.cpds.eng.cam.ac.uk/">Centre for Photonic Devices and Sensors</a> and Director of CAPE, who led the study, said: “Our research offers up a wearable AR experience that rivals the market leaders thanks to its comfortable 3D viewing which causes no nausea or eyestrain to the user. It can deliver high-quality clear images directly on the retina, even if the user is wearing glasses. This can help the user to see displayed real-world and virtual objects clearly in an immersive environment, regardless of the quality of the user’s vision."</p> <p><strong><em>Reference: </em></strong><br /> <em>Pawan K. Shrestha, Matt J. Pryn, Jia Jia, et al. ‘Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box.’ Research (2019). 
DOI: </em><em><a href="https://doi.org/10.34133/2019/9273723">10.34133/2019/9273723</a></em></p> <p><sup>1</sup>Participants comprised industrial representatives and academic researchers familiar with 3D display technology.</p> <p><em><a href="https://www.eng.cam.ac.uk/news/new-augmented-reality-head-mounted-display-offers-unrivalled-viewing-experience">Originally published</a> on the Department of Engineering website.</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge engineers have developed a new augmented reality (AR) head-mounted display (HMD) that delivers a realistic 3D viewing experience, without the commonly associated side effects of nausea or eyestrain.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-151822" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/151822">New augmented reality head mounted display offers unrivalled viewing experience</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/GPahi9_asYk?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. 
Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Sun, 22 Sep 2019 14:00:00 +0000 Cambridge researchers and Jaguar Land Rover develop immersive 3D head-up display for in-car use /research/news/cambridge-researchers-and-jaguar-land-rover-develop-immersive-3d-head-up-display-for-in-car-use <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/jaguarimage3.gif?itok=XUw8CCbl" alt="" title="Artist&amp;#039;s impression of head-up display in Jaguar, Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Engineers are working on a powerful new 3D head-up display to project safety alerts such as lane-departure warnings, hazard detection and sat-nav directions, and to reduce the effect of poor visibility in bad weather or low-light conditions. 
Augmented reality would add the perception of depth to the image by mapping the messages directly onto the road ahead.</p> <p><a href="https://elib.uni-stuttgart.de/handle/11682/8868">Studies conducted in Germany</a> show that the use of stereoscopic 3D displays in an automotive setting can improve reaction times to ‘popping-out’ instructions and improve depth judgements while driving.</p> <p>In the future, the innovative technology could be used by passengers to watch 3D movies. Head- and eye-tracking technology would follow the user’s position to ensure they can see 3D pictures without the need for the individual screens or shutter glasses used in cinemas.</p> <p>In a fully autonomous future, the 3D displays would offer users a personalised experience and allow ride-sharers to independently select their own content. Several passengers sharing a journey would be able to enjoy their own choice of media – including journey details, points of interest or movies – optimised for where they are sitting.</p> <p>The research – undertaken in partnership with the Centre for Advanced Photonics and Electronics (CAPE) at the University of Cambridge – is focused on developing an immersive head-up display that will closely match real-life experience, allowing drivers to react more naturally to hazards and prompts.</p> <p>Valerian Meijering, Human Machine Interface &amp; Head-Up Display Researcher for Jaguar Land Rover, said: “Development in virtual and augmented reality is moving really quickly. This consortium takes some of the best technology available and helps us to develop applications suited to the automotive sector.”</p> <p>Professor Daping Chu, Director of the Centre for Photonic Devices and Sensors and Director of the Centre for Advanced Photonics and Electronics, said: “This programme is at the forefront of development in the virtual reality space – we’re looking at concepts and components which will set the scene for the connected, shared and autonomous cars of the future. 
CAPE Partners are world-leading players strategically positioned in the value chain network. Their engagement provides a unique opportunity to make a greater impact on society and further enhance the business value of our enterprises.”</p> <p>The next-generation head-up display research forms part of the development of Jaguar Land Rover’s ‘Smart Cabin’ vision: applying technologies which combine to create a personalised space inside the vehicle for driver and passengers, with enhanced safety, entertainment and convenience features as part of an autonomous, shared future.</p> <p><em>Adapted from a press release by Jaguar Land Rover</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers from the University of Cambridge are working with Jaguar Land Rover to develop next-generation head-up display technology that could beam real-time safety information in front of the driver, and allow passengers to stream 3D movies directly from their seats as part of a shared, autonomous future.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">This programme is at the forefront of development in the virtual reality space – we’re looking at concepts and components which will set the scene for the connected, shared and autonomous cars of the future</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Daping Chu</div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Artist&#039;s impression of head-up display in Jaguar</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long 
field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 20 Aug 2019 06:33:04 +0000