University of Cambridge - Daping Chu /taxonomy/people/daping-chu en Stackable ‘holobricks’ can make giant 3D images /research/news/stackable-holobricks-can-make-giant-3d-images <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/toytrain1.jpg?itok=547VD_Lw" alt="Reconstructed holographic images of a toy train with holobricks and original image captured by a camera" title="Reconstructed holographic images of a toy train (top) with holobricks and original image captured by a camera (bottom), Credit: CAPE" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge and Disney Research, developed a holobrick proof-of-concept, which can tile holograms together to form a large seamless 3D image. This is the first time this technology has been demonstrated, and it opens the door to scalable holographic 3D displays. The <a href="https://www.nature.com/articles/s41377-022-00742-7">results</a> are reported in the journal <em>Light: Science &amp; Applications</em>.</p>&#13; &#13; <p>As technology develops, people want high-quality visual experiences, from 2D high-resolution TV to 3D holographic augmented or virtual reality, and large true 3D displays. These displays need to support a significant amount of data flow: for a 2D full HD display, the information data rate is about three gigabits per second (Gb/s), but a 3D display of the same resolution would require a rate of about three terabits per second (Tb/s), which is not yet available.</p>&#13; &#13; <p>Holographic displays can reconstruct high-quality images for real 3D visual perception. 
They are considered the ultimate display technology to connect the real and virtual worlds for immersive experiences.</p>&#13; &#13; <p>“Delivering an adequate 3D experience using the current technology is a huge challenge,” said Professor Daping Chu from Cambridge’s Department of Engineering, who led the research. “Over the past ten years, we’ve been working with our industrial partners to develop holographic displays which allow the simultaneous realisation of large size and large field-of-view, which needs to be matched with a hologram with a large optical information content.”</p>&#13; &#13; <p>However, the information content of current holograms is much greater than the display capabilities of current light engines, known as spatial light modulators, due to their limited space bandwidth product.</p>&#13; &#13; <p>For 2D displays, it’s standard practice to tile small displays together to form one large display. The approach being explored here is similar, but for 3D displays, which has not been done before. “Joining pieces of 3D images together is not trivial, because the final image must be seen as seamless from all angles and all depths,” said Chu, who is also Director of the Centre for Advanced Photonics and Electronics (CAPE). 
“Directly tiling 3D images in real space is just not possible.”</p>&#13; &#13; <p>To address this challenge, the researchers developed the holobrick unit, based on coarse integrated holographic displays for angularly tiled 3D images, a concept developed at CAPE with Disney Research about seven years ago.</p>&#13; &#13; <p>Each holobrick uses a high-information-bandwidth spatial light modulator for information delivery, in conjunction with coarse integrated optics, to form angularly tiled 3D holograms with large viewing areas and fields of view.</p>&#13; &#13; <p>Careful optical design makes sure the holographic fringe pattern fills the entire face of the holobrick, so that multiple holobricks can be seamlessly stacked to form a scalable, spatially tiled 3D holographic display capable of both a wide field of view and large size.</p>&#13; &#13; <p>The proof-of-concept developed by the researchers is made of two seamlessly tiled holobricks. Each full-colour brick is 1024×768 pixels, with a 40° field of view and 24 frames per second, to display tiled holograms for full 3D images.</p>&#13; &#13; <p>“There are still many challenges ahead to make ultra-large 3D displays with wide viewing angles, such as a holographic 3D wall,” said Chu. “We hope that this work can provide a promising way to tackle this issue based on the currently limited display capability of spatial light modulators.”</p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Jin Li; Quinn Smithwick; Daping Chu. ‘<a href="https://www.nature.com/articles/s41377-022-00742-7">Holobricks: Modular Coarse Integral Holographic Displays.</a>’ Light: Science &amp; Applications (2022). 
DOI: 10.1038/s41377-022-00742-7</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Researchers have developed a new method to display highly realistic holographic images using ‘holobricks’ that can be stacked together to generate large-scale holograms.</p>&#13; </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">CAPE</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Reconstructed holographic images of a toy train (top) with holobricks and original image captured by a camera (bottom)</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 16 Mar 2022 00:53:32 +0000 sc604 230551 at New augmented reality head-mounted display offers unrivalled viewing experience /research/news/new-augmented-reality-head-mounted-display-offers-unrivalled-viewing-experience <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/crop1_7.jpg?itok=k0mwcO4Q" alt="Three-dimensional augmented reality image" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The device has an enlarged, scalable eye-box and an increased field of view of 36°, designed for a comfortable viewing experience. It displays images on the retina using pixel beam scanning, which ensures the image stays in focus regardless of the distance on which the user is fixating. 
<a href="https://spj.science.org/doi/10.34133/2019/9273723?permanently=true">Details</a> are reported in the journal <em>Research</em>.</p> <p>Developed by researchers at the <a href="https://www.cape.eng.cam.ac.uk/">Centre for Advanced Photonics and Electronics (CAPE)</a> in collaboration with the Huawei European Research Centre in Munich, the HMD uses partially reflective beam splitters to form an additional ‘exit pupil’ (a virtual opening through which light travels). This, together with narrow pixel beams that travel parallel to each other, and which do not disperse in other directions, produces a high-quality image that remains unaffected by changes in eye focus.</p> <p>The results of a subjective user study conducted with more than 50 participants aged between 16 and 60<sup>1</sup> showed the 3D effect to be ‘very convincing’ for objects from 20 cm to 10 m; the images and videos to be of ‘vivid colour’ and high contrast with no observable pixels; and crucially, none of the participants reported any eyestrain or nausea, even after prolonged periods of use over a few hours or even all day.</p> <p>The HMD offers high brightness and is suited to a wide range of indoor and outdoor uses. Further research is exploring its potential use in applications such as training, CAD (computer-aided design) development, hospitality, data manipulation, outdoor sport, defence and construction, as well as miniaturising the current head-mounted prototype to a glasses-based format.</p> <p><a href="https://www.eng.cam.ac.uk/profiles/dpc31">Professor Daping Chu</a>, Director of the <a href="https://www.cpds.eng.cam.ac.uk/">Centre for Photonic Devices and Sensors</a> and Director of CAPE, who led the study, said: “Our research offers up a wearable AR experience that rivals the market leaders thanks to its comfortable 3D viewing, which causes no nausea or eyestrain to the user. 
It can deliver high-quality clear images directly on the retina, even if the user is wearing glasses. This can help the user to see displayed real-world and virtual objects clearly in an immersive environment, regardless of the quality of the user’s vision.”</p> <p><strong><em>Reference: </em></strong><br /> <em>Pawan K. Shrestha, Matt J. Pryn, Jia Jia, et al. ‘Accommodation-Free Head Mounted Display with Comfortable 3D Perception and an Enlarged Eye-box.’ Research (2019). DOI: </em><em><a href="https://doi.org/10.34133/2019/9273723">10.34133/2019/9273723</a></em></p> <p><sup>1</sup>Participants comprised industrial representatives and academic researchers familiar with 3D display technology.</p> <p><em><a href="https://www.eng.cam.ac.uk/news/new-augmented-reality-head-mounted-display-offers-unrivalled-viewing-experience">Originally published</a> on the Department of Engineering website.</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Cambridge engineers have developed a new augmented reality (AR) head-mounted display (HMD) that delivers a realistic 3D viewing experience, without the commonly associated side effects of nausea or eyestrain.</p> </div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-151822" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/151822">New augmented reality head mounted display offers unrivalled viewing experience</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/GPahi9_asYk?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field 
field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Sun, 22 Sep 2019 14:00:00 +0000 Anonymous 207722 at Cambridge and Nanjing break ground on 'smart cities' Centre /news/cambridge-and-nanjing-break-ground-on-smart-cities-centre <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/news/nanjing.jpg?itok=7w62pRCX" alt="Ground breaking in Nanjing" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Cambridge Vice-Chancellor Professor Stephen J Toope joined Zhang Jinghua, Party Secretary of Nanjing City Party Committee and Nanjing Deputy Mayor Jiang Yuejian to turn the 
first soil at the site where the Centre's dedicated building will rise in Nanjing's Jiangbei New Area.</p> <p>The Cambridge University-Nanjing Centre of Technology and Innovation will establish a home for joint research and innovation, in collaboration with the Chinese government, industry and China's global research universities, dedicated to creating the 'smart' cities of the future.</p> <p>"Here in Nanjing, an ancient city and former imperial capital, we are embarking on a unique enterprise," Vice-Chancellor Toope said at the groundbreaking ceremony. "The innovations emerging from this Centre will enable the development of 'smart' cities in which sensors can enable sustainable lifestyles, improve healthcare, limit pollution and make efficient use of energy."</p> <p>Cambridge and its Chinese partners will share revenue derived from the commercialisation of Intellectual Property (IP) developed at the Centre. It is the University’s first overseas enterprise at this scale.</p> <p>Funded by the Nanjing Municipality for its first five years, the project will have its own dedicated building as a pilot urban development based on high levels of technological innovation.</p> <p>At the heart of the new Centre’s activities will be research into technologies that support a modern 21st-century city with integrated IT, health care and building management. Innovations emerging from the Centre will enable the development of 'smart' cities in which sensors – applied at the individual level and all the way through to the level of large infrastructure – will enable sustainable lifestyles.</p> <p>As well as supporting health and wellbeing in new cities, the new Centre will help deliver efficient energy use through its academic and entrepreneurial activities.</p> <p>The agreement between Cambridge and Nanjing will fund positions in Nanjing, both academic and management, and will allow Cambridge-based academics to engage with specific, long-term projects in Nanjing. 
It will also support the establishment of a professorship, based in Cambridge, with responsibility as the Centre’s Academic Director.</p> <p>The project has been driven by Cambridge’s Department of Engineering, although it is hoped that there will be opportunities to widen participation to other departments and Schools. IP generated by research funded through the Centre will be licensed for commercialisation by Cambridge University’s innovation branch, Cambridge Enterprise.</p> <p>The Centre will seek to demonstrate the power of collaboration with China’s universities, industry, government and other partners to conduct the kind of excellent academic research today that will make life better for the city dwellers of tomorrow.</p> <p>One of the two initial projects already approved is to create a high-resolution scanner that can provide a low-cost, easily accessible method for examining difficult areas of the body, such as bent spines, without using large and expensive CT scanners.</p> <p>That project will be led by Cambridge Engineering Department Professor Richard Prager, in collaboration with China's Southeast University and established local ultrasonic manufacturer Vinno.</p> <p>A second identified project, led by Principal Investigator Professor Toni Vidal-Puig from Cambridge’s Clinical Biochemistry Department, will study the associated complications of increased obesity in China.</p> <p>Both themes are closely linked to the focus area of local partner, NIHA (Nanjing International Healthcare Area).  </p> <p>The Vice-Chancellor was joined at the groundbreaking ceremony by representatives from partners Nanjing University, Southeast University, Peking University, 
Tsinghua University, Fudan University and Zhejiang University, as well as the Academic Director of the Cambridge University-Nanjing Centre, Professor Daping Chu of Cambridge's Electrical Engineering Department, and Pro-Vice-Chancellor for International Relations Eilis Ferran.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The University of Cambridge and the Nanjing Municipal Government have broken ground on the Cambridge University-Nanjing Centre of Technology and Innovation.</p> </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Here in Nanjing, an ancient city and former imperial capital, we are embarking on a unique enterprise.</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Vice-Chancellor Stephen J Toope</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 10 Sep 2019 04:03:44 +0000 plc32 207462 at Cambridge researchers and Jaguar Land Rover develop immersive 3D head-up display for in-car use /research/news/cambridge-researchers-and-jaguar-land-rover-develop-immersive-3d-head-up-display-for-in-car-use <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/jaguarimage3.gif?itok=XUw8CCbl" alt="" title="Artist&amp;#039;s impression of head-up display in Jaguar, Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Engineers are working on a powerful new 3D head-up display to project safety information, such as lane departure warnings, hazard detection and sat nav directions, and to reduce the effect of poor visibility in poor weather or light conditions. 
Augmented reality would add the perception of depth to the image by mapping the messages directly onto the road ahead.</p> <p><a href="https://elib.uni-stuttgart.de/handle/11682/8868">Studies conducted in Germany</a> show that the use of stereoscopic 3D displays in an automotive setting can improve reaction times to ‘popping-out’ instructions and improve depth judgments while driving.</p> <p>In the future, the innovative technology could be used by passengers to watch 3D movies. Head- and eye-tracking technology would follow the user’s position to ensure they can see 3D pictures without the need for individual screens or the shutter glasses worn at the cinema.</p> <p>In a fully autonomous future, the 3D displays would offer users a personalised experience and allow ride-sharers to independently select their own content. Several passengers sharing a journey would be able to enjoy their own choice of media – including journey details, points of interest or movies – optimised for where they are sitting.  </p> <p>The research – undertaken in partnership with the Centre for Advanced Photonics and Electronics (CAPE) at the University of Cambridge – is focused on developing an immersive head-up display, which will closely match real-life experience, allowing drivers to react more naturally to hazards and prompts.</p> <p>Valerian Meijering, Human Machine Interface &amp; Head-Up Display Researcher for Jaguar Land Rover, said: “Development in virtual and augmented reality is moving really quickly. This consortium takes some of the best technology available and helps us to develop applications suited to the automotive sector.”</p> <p>Professor Daping Chu, Director of the Centre for Photonic Devices and Sensors and Director of the Centre for Advanced Photonics and Electronics, said: “This programme is at the forefront of development in the virtual reality space – we’re looking at concepts and components which will set the scene for the connected, shared and autonomous cars of the future. 
CAPE Partners are world-leading players strategically positioned in the value chain network. Their engagement provides a unique opportunity to make a greater impact on society and further enhance the business value of our enterprises.”</p> <p>The next-generation head-up display research forms part of Jaguar Land Rover’s ‘Smart Cabin’ vision: applying technologies which combine to create a personalised space inside the vehicle for driver and passengers, with enhanced safety, entertainment and convenience features, as part of an autonomous, shared future.</p> <p><em>Adapted from a press release by Jaguar Land Rover</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Researchers from the University of Cambridge are working with Jaguar Land Rover to develop next-generation head-up display technology that could beam real-time safety information in front of the driver, and allow passengers to stream 3D movies directly from their seats as part of a shared, autonomous future.</p> </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">This programme is at the forefront of development in the virtual reality space – we’re looking at concepts and components which will set the scene for the connected, shared and autonomous cars of the future</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Daping Chu</div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Artist&#039;s impression of head-up display in Jaguar</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long 
field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 20 Aug 2019 06:33:04 +0000 Anonymous 207032 at Smart glass goes from clear to opaque and back again – 27 million times /research/features/smart-glass-goes-from-clear-to-opaque-and-back-again-27-million-times <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/features/crop.gif?itok=POxADdiA" alt="" title="Glass window with panels of Smectic A, Credit: Daping Chu" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Imagine a glass skyscraper in which all of the windows could go from clear to opaque at the flick of a switch, allowing occupants to regulate the amount of sunlight 
coming through the windows without having to rely on costly air conditioning or other artificial methods of temperature control.</p> <p>Researchers at the University of Cambridge have developed a type of ‘smart’ glass that switches back and forth between transparent and opaque, while using very low amounts of energy. The material, known as Smectic A composites, could be used in building, automotive or display applications.</p> <p>Working with industrial partners including Dow Corning, the Cambridge researchers have been developing ‘Smectic A’ composites over the past two decades. The team, based at the <a href="http://www-cape.eng.cam.ac.uk/">Centre for Advanced Photonics and Electronics</a> (CAPE), has made samples of Smectic A-based glass, and is also able to produce the material in a roll-to-roll process so that it can be printed onto plastic. It can be switched back and forth from transparent to opaque millions of times, and can be kept in either state for as long as the user wants.</p> <p><img alt="" src="/system/files/untitled2.gif" /></p> <p>“In addition to going back and forth between clear and opaque, we can also do different levels of opacity, so for example, you could have smart windows in an office building that automatically became more or less opaque, depending on the amount of sunlight coming through,” said Professor Daping Chu of CAPE, one of the developers of Smectic A technology.</p> <p>The main component of the composite material is a type of liquid crystal known as a ‘smectic’ liquid crystal, which is different from both a solid crystal and a liquid.</p> <p>The simplest definition of a crystal is a solid in which the atoms form a distinct spatial order. A liquid crystal, such as those used in many televisions, flows like a liquid, but has some order in its arrangement of molecules. 
The liquid crystals used in televisions are known as nematic liquid crystals, in which the molecules are lined up in the same direction, but are otherwise randomly arranged.</p> <p>In a smectic liquid crystal, the molecules have a similar directional ordering, but they are also arranged in stacked layers, which confines the movement of ionic additives. When a voltage is applied, the liquid crystal molecules all try to align themselves with the electric field, and the material they are embedded in (glass or plastic) will appear transparent.</p> <p>When the direction of the voltage is slowly changed, the ionic additives disrupt the layer structure of the smectic liquid crystals, making the glass or plastic panel appear milky. Increasing the frequency of the voltage causes the movement of the ionic additives to freeze out, switching the plastic or glass panel back to transparent. These transitions happen in a fraction of a second, and when the voltage is switched off, the material will remain either transparent or opaque until the user wants it to switch again, meaning that unless the material is actively switching states, it requires no power.</p> <p>Possible applications for smectic A composites include uses in the construction, advertising and automotive industries. 
For example, it could be applied to glass buildings in order to regulate the amount of sunlight that gets through, or it could be used as a ‘sunroof’ in a car that could be switched back and forth between transparent and opaque.</p> <p>The work has been patented, and is being commercialised by Cambridge Enterprise, the University’s commercialisation arm, to a major industrial partner through a technology framework transfer agreement.</p> <p>The original motivation behind the development of smectic A was low-power electronic signage, of the type commonly seen at bus stops, that would not fade in bright sunlight.</p> <p>The original form of smectic A was based on organic materials, but the newer version is silicon-based. One sample of smectic A in a lab at CAPE has been switched back and forth between opaque and transparent more than 27 million times, switching once per second for several years.</p> <p>“The earlier glass-based samples we produced worked well, but there was a challenge in making them in sizes larger than a metre square,” said Chu. “So we started making it in plastic, which meant we could make bigger samples, and attach it to things like windows in order to retrofit them. 
This would reduce the effects of solar radiation, since the energy is being scattered rather than absorbed.”</p> <p>Chu’s team is also working on other possible applications for Smectic A and related technologies, including a transparent heat-controllable window and a non-intrusive public information messaging system.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A smart material that switches back and forth between transparent and opaque could be installed in buildings or automobiles, potentially reducing energy bills by avoiding the need for costly air conditioning.</p> </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">You could have smart windows in an office building that automatically became more or less opaque, depending on the amount of sunlight coming through.</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Daping Chu</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Daping Chu</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Glass window with panels of Smectic A</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this 
work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 07 Jun 2016 23:01:00 +0000 sc604 174882 at Heads up: Cambridge holographic technology adopted by Jaguar Land Rover /research/features/heads-up-cambridge-holographic-technology-adopted-by-jaguar-land-rover <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/features/image-crop.png?itok=oqF2gItY" alt="Head-Up Display (HUD) projects key driving information onto a small area of the windscreen." title="Head-Up Display (HUD) projects key driving information onto a small area of the windscreen., Credit: Jaguar Land Rover" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Cambridge researchers have developed a new type of head-up display for vehicles which is the first to use laser holographic techniques to project information such as speed, direction and navigation onto the windscreen so the driver doesn’t have to take their eyes off the road. The technology – which was conceptualised in the University’s Department of Engineering more than a decade ago – is now available on all Jaguar Land Rover vehicles.
According to the researchers behind the technology, it is another step towards cars which provide a fully immersive experience, or could even improve safety by monitoring driver behaviour.</p>&#13; &#13; <p>Cars can now park for us, keep us from skidding out of control, or even prevent us from colliding with other cars. Head-up displays (HUDs) are one of the many features which have been incorporated into cars in recent years. Alongside the development of more sophisticated in-car technology, various companies around the world, most notably Google, are developing autonomous cars.</p>&#13; &#13; <p>“We’re moving towards a fully immersive driver experience in cars, and we think holographic technology could be a big part of that, by providing important information, or even by encouraging good driver behaviour,” said one of the technology’s developers, Professor Daping Chu of the University’s Department of Engineering, who is also Chairman of the Centre for Advanced Photonics and Electronics (CAPE).</p>&#13; &#13; <p>CAPE was established in 2004 to enable Cambridge researchers to work in partnership with industry to translate science into new technologies and products. The holographic HUD technology originated with Professor Bill Crossland in 2001, and was licensed to and developed by CAPE partner company Alps Electric, and then by Two Trees Photonics Ltd at Milton Keynes, in collaboration with researchers at CAPE. Products were designed by Two Trees Photonics and Alps, and manufactured by Alps for Jaguar Land Rover. The HUD became an available option on their vehicles in September 2014.</p>&#13; &#13; <p>The HUD technology developed at Cambridge is the first to use laser holographic techniques, which provide better colour, brightness and contrast than other systems, but in a smaller, lighter package.
It provides key information to the driver without them having to take their eyes away from the road.</p>&#13; &#13; <p>But according to Chu, the technology’s potential has yet to be fully realised, and its real advantage is what it could be used for in future models. “What we really want to see is a fully 3D display which can provide much more information to the driver in a non-intrusive way – this is still a first-generation piece of technology,” he said.</p>&#13; &#13; <p>For a technology that feels somewhat futuristic, HUDs actually have a long history. The earliest HUDs were developed during the Second World War to help pilots hit their targets while manoeuvring. The modern HUD became commonplace in military aircraft in the 1960s, in commercial aircraft in the 1970s, and in 1988 the first production car with a HUD was introduced.</p>&#13; &#13; <p>In aircraft, a typical HUD includes information such as airspeed, altitude, heading and a horizon line, with additional information such as distance to target and weapon status for military applications.</p>&#13; &#13; <p>Most of the HUDs in passenger cars display similar information to what can be seen on the dashboard – speedometer and tachometer, as well as navigation information. Some models also display night vision information.</p>&#13; &#13; <p>The commercially available Cambridge HUD projects information which is relevant to the driver onto the windscreen, in full colour and in two dimensions. But according to Chu, this type of technology is just getting started.</p>&#13; &#13; <p>“There are three main types of information that we could integrate into future holographic head-up displays,” he said.
“The first is the type of information that’s on today’s displays, but potentially we could add other information in a non-intrusive way: for example, if the driver passes a petrol station, perhaps the price of petrol at that station could flash up in the corner – the trick is how to display the most useful information in a way that doesn’t distract the driver.</p>&#13; &#13; <p>“The next level of information that could be incorporated into holographic HUDs is information about the position of pedestrians, cyclists, kerbs or other vehicles; or whether the driver is on the right track. And if we move into the next level, we start thinking about how we can use this sort of technology to help encourage good driving behaviour.”</p>&#13; &#13; <p>Although it is in the realm of fantasy at the moment, the sorts of things which Chu envisions for future holographic HUDs could help avoid accidents by monitoring driver behaviour. “Imagine if this technology could be used to give alerts to the driver if they were driving too fast, or getting drowsy, or were over the legal alcohol limit. You could have all of this information with an augmented reality approach – your screen is your world, really. What I want is for the driver to have an immersive experience in how they connect to the world.”</p>&#13; &#13; <p>The sort of immersive experience which Chu predicts crosses over with the development of autonomous or driverless cars, another project which involves researchers from Cambridge’s Engineering department.</p>&#13; &#13; <p>“The car will evolve,” said Chu. “I’m sure in 50 years’ time, everything in cars will be controlled by computers, but it’s being developed in different directions. The sorts of questions we’re interested in answering are around the idea of integrating critical and non-critical systems in a vehicle. When these systems are integrated, who ultimately makes the decision – the car or the driver?
And in the case of disagreement, who wins?”</p>&#13; &#13; <p>Lee Skrypchuk, Human Machine Interface Technical Specialist at Jaguar Land Rover, said: “The development of a laser holographic HUD presented a number of technical challenges, but also a number of benefits, including small package size, high optical efficiency, wide colour gamut and cross-platform compatibility. Incorporating a laser holographic light engine was a true world-first application of technology for Jaguar Land Rover and I’m delighted that the technology has worked so well in our vehicles.”</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A ‘head-up’ display for passenger vehicles developed at Cambridge, the first to use holographic techniques, has been incorporated into Jaguar Land Rover vehicles.</p>&#13; </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We’re moving towards a fully immersive driver experience in cars, and we think holographic technology could be a big part of that.</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Daping Chu</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://media.jaguarlandrover.com/en/en/en/en" target="_blank">Jaguar Land Rover</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Head-Up Display (HUD) projects key driving information onto a small area of the windscreen.</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long 
field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 26 Nov 2015 08:00:00 +0000 sc604 163182 at