University of Cambridge – Jaguar Land Rover

Using machine learning to monitor driver ‘workload’ could help improve road safety

7 December 2023

Image: Head-up display of traffic information and weather as seen by the driver. Credit: Coneyl Jay via Getty Images

Researchers have developed an adaptable algorithm that could improve road safety by predicting when drivers are able to safely interact with in-vehicle systems or receive messages, such as traffic alerts, incoming calls or driving directions.

The researchers, from the University of Cambridge, working in partnership with Jaguar Land Rover (JLR), used a combination of on-road experiments, machine learning and Bayesian filtering techniques to reliably and continuously measure driver ‘workload’. Driving in an unfamiliar area may translate to a high workload, while a daily commute may mean a lower workload.

The resulting algorithm is highly adaptable and can respond in near real time to changes in the driver’s behaviour and status, road conditions, road type, or driver characteristics.

This information could then be incorporated into in-vehicle systems such as infotainment, navigation, displays, advanced driver assistance systems (ADAS) and others. Any driver-vehicle interaction could then be customised to prioritise safety and enhance the user experience, delivering adaptive human-machine interactions. For example, drivers would only be alerted at times of low workload, so that they can keep their full concentration on the road in more stressful driving scenarios. The results are reported in the journal IEEE Transactions on Intelligent Vehicles (https://ieeexplore.ieee.org/document/10244092).

“More and more data is made available to drivers all the time. However, with increasing levels of driver demand, this can be a major risk factor for road safety,” said co-first author Dr Bashar Ahmad from Cambridge’s Department of Engineering. “There is a lot of information that a vehicle can make available to the driver, but it’s not safe or practical to do so unless you know the status of the driver.”

A driver’s status – or workload – can change frequently. Driving in a new area, in heavy traffic or in poor road conditions, for example, is usually more demanding than a daily commute.

“If you’re in a demanding driving situation, that would be a bad time for a message to pop up on a screen or a heads-up display,” said Ahmad. “The issue for car manufacturers is how to measure how occupied the driver is, and to instigate interactions or issue messages or prompts only when the driver is happy to receive them.”

There are algorithms for measuring levels of driver demand using eye-gaze trackers and biometric data from heart-rate monitors, but the Cambridge researchers wanted to develop an approach that could do the same thing using information available in any car: specifically, driving performance signals such as steering, acceleration and braking data. It should also be able to consume and fuse different unsynchronised data streams with different update rates, including from biometric sensors if available.

To measure driver workload, the researchers first developed a modified version of the Peripheral Detection Task to collect subjective workload information during driving in an automated way. For the experiment, a phone showing a route on a navigation app was mounted to the car’s central air vent, next to a small LED ring light that blinked at regular intervals. Participants all followed the same route through a mix of rural, urban and main roads. They were asked to push a finger-worn button whenever the LED lit up red and they perceived they were in a low-workload scenario.

Video analysis of the experiment, paired with the data from the buttons, allowed the researchers to identify high-workload situations, such as busy junctions or a vehicle in front of or behind the driver behaving unusually.

The on-road data was then used to develop and validate a supervised machine learning framework for profiling drivers based on the average workload they experience, and an adaptable Bayesian filtering approach for sequentially estimating, in real time, the driver’s instantaneous workload from several driving performance signals, including steering and braking. The framework combines macro and micro measures of workload, where the former is the driver’s average workload profile and the latter is the instantaneous one.

“For most machine learning applications like this, you would have to train it on a particular driver, but we’ve been able to adapt the models on the go using simple Bayesian filtering techniques,” said Ahmad. “It can easily adapt to different road types and conditions, or to different drivers using the same car.”
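The paper’s models are richer than a news story can carry, but the flavour of ‘sequentially estimating the driver’s instantaneous workload’ is easy to sketch. The minimal Python illustration below is not the published method: the workload grid, random-walk dynamics, Gaussian observation model and toy feature values are all assumptions. It runs a discrete Bayes filter that fuses a stream of noisy driving-performance features into a smoothed workload estimate.

```python
import numpy as np

# Discrete grid of candidate workload levels between 0 (low) and 1 (high).
GRID = np.linspace(0.0, 1.0, 51)

def transition_matrix(sigma=0.05):
    """Random-walk dynamics: workload drifts slowly between time steps."""
    d = GRID[None, :] - GRID[:, None]
    T = np.exp(-0.5 * (d / sigma) ** 2)
    return T / T.sum(axis=1, keepdims=True)   # rows are p(next | current)

def likelihood(feature, noise=0.2):
    """Assumed observation model: a normalised driving-performance feature
    (e.g. steering-reversal rate) is a noisy reading of true workload."""
    return np.exp(-0.5 * ((feature - GRID) / noise) ** 2)

def estimate_workload(features):
    """Sequentially estimate instantaneous workload from a feature stream."""
    T = transition_matrix()
    belief = np.full(GRID.size, 1.0 / GRID.size)   # flat prior over levels
    estimates = []
    for f in features:
        belief = T.T @ belief        # predict: apply random-walk dynamics
        belief *= likelihood(f)      # update: weight by the new observation
        belief /= belief.sum()       # renormalise to a probability vector
        estimates.append(GRID @ belief)   # report the posterior mean
    return np.array(estimates)

# Toy feature stream: calm cruising, a busy junction, then calm again.
rng = np.random.default_rng(1)
stream = np.clip(np.r_[rng.normal(0.2, 0.1, 60),
                       rng.normal(0.8, 0.1, 20),
                       rng.normal(0.2, 0.1, 60)], 0.0, 1.0)
track = estimate_workload(stream)
print(track[55], track[75], track[135])   # low, high, low again
```

In a model of this shape, the ‘macro’ driver profile described above could enter as an informative prior in place of the flat one, and additional unsynchronised streams, such as biometric sensors, would simply contribute extra likelihood terms whenever a sample arrives.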
The research was conducted in collaboration with JLR, who carried out the experimental design and the data collection. It was part of a project sponsored by JLR under the CAPE agreement with the University of Cambridge.

“This research is vital in understanding the impact of our design from a user perspective, so that we can continually improve safety and curate exceptional driving experiences for our clients,” said JLR’s Senior Technical Specialist of Human Machine Interface, Dr Lee Skrypchuk. “These findings will help define how we use intelligent scheduling within our vehicles to ensure drivers receive the right notifications at the most appropriate time, allowing for seamless and effortless journeys.”

The research at Cambridge was carried out by a team from the Signal Processing and Communications Laboratory (SigProC), Department of Engineering, under the supervision of Professor Simon Godsill. It was led by Dr Bashar Ahmad and included Nermin Caber (a PhD student at the time) and Dr Jiaming Liang, who all worked on the project while based at Cambridge’s Department of Engineering.

Reference:
Nermin Caber et al. ‘Driver Profiling and Bayesian Workload Estimation Using Naturalistic Peripheral Detection Study Data.’ IEEE Transactions on Intelligent Vehicles (2023). https://ieeexplore.ieee.org/document/10244092
DOI: 10.1109/TIV.2023.3313419

AI-based ‘no-touch touchscreen’ could reduce risk of pathogen spread from surfaces

23 July 2020

Image: Predictive touch system. Credit: Jaguar Land Rover

A ‘no-touch touchscreen’ developed for use in cars could also have widespread applications in a post-COVID-19 world, by reducing the risk of transmission of pathogens from surfaces.

The patented technology, known as ‘predictive touch’, was developed by engineers at the University of Cambridge as part of a research collaboration with Jaguar Land Rover. It uses a combination of artificial intelligence and sensor technology to predict a user’s intended target on touchscreens and other interactive displays or control panels, selecting the correct item before the user’s hand reaches the display.

More and more passenger cars have touchscreen technology to control entertainment, navigation or temperature control systems. However, users can often miss the item they want – for example because of acceleration or vibration from road conditions – and have to reselect, meaning their attention is taken off the road and the risk of an accident increases.

In lab-based tests, driving simulators and road-based trials, the predictive touch technology reduced interaction effort and time by up to 50%, thanks to its ability to predict the user’s intended target with high accuracy early in the pointing task.

As lockdown restrictions around the world continue to ease, the researchers say the technology could also be useful in a post-COVID-19 world. Many everyday consumer transactions are conducted using touchscreens: ticketing at rail stations or cinemas, ATMs, check-in kiosks at airports and self-service checkouts in supermarkets, as well as many industrial and manufacturing applications. Eliminating the need to actually touch a touchscreen or other interactive display could reduce the risk of spreading pathogens – such as the common cold, influenza or even coronavirus – from surfaces.

In addition, the technology could be incorporated into smartphones, and could be useful while walking or jogging, allowing users to select items easily and accurately without the need for any physical contact.
It even works in situations such as a moving car on a bumpy road, or if the user has a motor disability which causes a tremor or sudden hand jerks, such as Parkinson’s disease or cerebral palsy.

“Touchscreens and other interactive displays are something most people use multiple times per day, but they can be difficult to use while in motion, whether that’s driving a car or changing the music on your phone while you’re running,” said Professor Simon Godsill from Cambridge’s Department of Engineering, who led the project. “We also know that certain pathogens can be transmitted via surfaces, so this technology could help reduce the risk for that type of transmission.”

The technology uses machine intelligence to determine the item the user intends to select on the screen early in the pointing task, speeding up the interaction. It uses a gesture tracker, including vision-based or RF-based sensors, which are increasingly common in consumer electronics; contextual information such as the user profile, interface design and environmental conditions; and data available from other sensors, such as an eye-gaze tracker, to infer the user’s intent in real time.
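The underlying algorithms are not spelled out in this article, but the idea of inferring a pointing target mid-gesture can be illustrated with a toy Bayesian scorer. In the sketch below, the button layout, the heading-alignment likelihood and the 0.7 selection threshold are all illustrative assumptions; the published system uses far richer trajectory models and sensor fusion.

```python
import numpy as np

# Illustrative on-screen targets (normalised display coordinates of buttons).
TARGETS = np.array([[0.2, 0.8], [0.5, 0.8], [0.8, 0.8], [0.5, 0.3]])

def intent_posterior(trajectory, kappa=8.0):
    """Posterior over targets given a partial pointing trajectory: each
    target is scored by how well the current motion direction points
    towards it (a von Mises-style heading likelihood; a crude stand-in
    for the trajectory models used in the published system)."""
    pos = trajectory[-1]
    heading = trajectory[-1] - trajectory[-2]
    heading /= np.linalg.norm(heading) + 1e-9
    scores = np.empty(len(TARGETS))
    for i, target in enumerate(TARGETS):
        to_target = target - pos
        to_target /= np.linalg.norm(to_target) + 1e-9
        scores[i] = kappa * heading.dot(to_target)
    post = np.exp(scores - scores.max())
    return post / post.sum()

# Tracked fingertip rising from the bottom of the screen towards the
# top-middle button; intent is inferred well before the screen is reached.
trajectory = np.linspace([0.5, 0.1], [0.5, 0.5], num=10)
post = intent_posterior(trajectory)
if post.max() > 0.7:               # select early once sufficiently confident
    print("predicted target:", TARGETS[post.argmax()], "p=%.2f" % post.max())
```

A real system would also fold in the contextual cues mentioned above, such as eye gaze and interface layout, as extra terms in the same posterior.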
“This technology also offers us the chance to make vehicles safer by reducing the cognitive load on drivers and increasing the amount of time they can spend focused on the road ahead. This is a key part of our Destination Zero journey,” said Lee Skrypchuk, Human Machine Interface Technical Specialist at Jaguar Land Rover.

It could also be used for displays that do not have a physical surface, such as 2D or 3D projections or holograms. It also promotes inclusive design practices and offers additional design flexibility, since the interface functionality can be seamlessly personalised for given users and the display size or location is no longer constrained by the user’s ability to reach and touch.

“Our technology has numerous advantages over more basic mid-air interaction techniques or conventional gesture recognition, because it supports intuitive interactions with legacy interface designs and doesn’t require any learning on the part of the user,” said Dr Bashar Ahmad, who led the development of the technology and the underlying algorithms with Professor Godsill. “It fundamentally relies on the system to predict what the user intends, and it can be incorporated into both new and existing touchscreens and other interactive display technologies.”

This software-based solution for contactless interactions has reached high technology readiness levels and can be seamlessly integrated into existing touchscreens and interactive displays, so long as the correct sensory data is available to support the machine learning algorithm.

The technology was developed between 2012 and 2018 by the Centre for Advanced Photonics and Electronics (CAPE), as part of the CAPE Motion Adaptive Touchscreen System for Automotive (MATSA 1 and 2) project.

Cambridge researchers and Jaguar Land Rover develop immersive 3D head-up display for in-car use

20 August 2019

Image: Artist’s impression of head-up display in Jaguar.

Researchers from the University of Cambridge are working with Jaguar Land Rover to develop next-generation head-up display technology that could beam real-time safety information in front of the driver, and allow passengers to stream 3D movies directly from their seats as part of a shared, autonomous future.

Engineers are working on a powerful new 3D head-up display to project safety alerts – such as lane departure warnings, hazard detection and sat-nav directions – and to reduce the effect of poor visibility in bad weather or light conditions. Augmented reality would add the perception of depth by mapping these messages directly onto the road ahead.

Studies conducted in Germany (https://elib.uni-stuttgart.de/handle/11682/8868) show that the use of stereoscopic 3D displays in an automotive setting can improve reaction times to ‘popping-out’ instructions and aid depth judgements while driving.

In the future, the technology could also be used by passengers to watch 3D movies. Head- and eye-tracking technology would follow the user’s position so that they see a 3D picture without the need for individual screens or the shutter glasses worn at the cinema.
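The consortium’s rendering pipeline is not described here, but the basic mechanism by which head and eye tracking yields glasses-free 3D is standard: render a separate view for each tracked eye through an asymmetric (‘off-axis’) frustum anchored to the physical screen. The sketch below computes those frusta; the screen geometry, coordinate frame and interpupillary distance are illustrative assumptions, not Jaguar Land Rover parameters.

```python
import numpy as np

# Assumed display geometry: a rectangle fixed in tracker coordinates
# (metres), lying in the z = 0 plane with the viewer on the +z side.
SCREEN_LL = np.array([-0.30, -0.10])   # lower-left corner (x, y)
SCREEN_W, SCREEN_H = 0.60, 0.20        # width and height
NEAR, FAR = 0.05, 10.0                 # clip planes
IPD = 0.063                            # assumed interpupillary distance

def off_axis_frustum(eye):
    """Asymmetric view frustum for one eye, so the rendered scene stays
    registered with the physical screen as the tracked head moves (the
    standard head-tracked stereo construction, e.g. args to glFrustum)."""
    dist = eye[2]                      # eye-to-screen-plane distance
    s = NEAR / dist                    # scale screen edges onto near plane
    left = (SCREEN_LL[0] - eye[0]) * s
    right = (SCREEN_LL[0] + SCREEN_W - eye[0]) * s
    bottom = (SCREEN_LL[1] - eye[1]) * s
    top = (SCREEN_LL[1] + SCREEN_H - eye[1]) * s
    return left, right, bottom, top, NEAR, FAR

head = np.array([0.05, 0.02, 0.65])    # head position from the tracker
for name, dx in (("left eye", -IPD / 2), ("right eye", +IPD / 2)):
    eye = head + np.array([dx, 0.0, 0.0])
    print(name, np.round(off_axis_frustum(eye), 4))
```

Steering each rendered view to the correct eye is then the job of the display optics, which is where the holographic and autostereoscopic hardware research comes in.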
In a fully autonomous future, the 3D displays would offer users a personalised experience and allow ride-sharers to independently select their own content. Several passengers sharing a journey would be able to enjoy their own choice of media – including journey details, points of interest or movies – optimised for where they are sitting.

The research – undertaken in partnership with the Centre for Advanced Photonics and Electronics (CAPE) at the University of Cambridge – is focused on developing an immersive head-up display which closely matches real-life experience, allowing drivers to react more naturally to hazards and prompts.

Valerian Meijering, Human Machine Interface & Head-Up Display Researcher for Jaguar Land Rover, said: “Development in virtual and augmented reality is moving really quickly. This consortium takes some of the best technology available and helps us to develop applications suited to the automotive sector.”

Professor Daping Chu, Director of the Centre for Photonic Devices and Sensors and Director of the Centre for Advanced Photonics and Electronics, said: “This programme is at the forefront of development in the virtual reality space – we’re looking at concepts and components which will set the scene for the connected, shared and autonomous cars of the future. CAPE partners are world-leading players strategically positioned in the value chain network. Their engagement provides a unique opportunity to make a greater impact on society and further enhance the business value of our enterprises.”

The next-generation head-up display research forms part of Jaguar Land Rover’s ‘Smart Cabin’ vision: applying technologies which combine to create a personalised space inside the vehicle for driver and passengers, with enhanced safety, entertainment and convenience features as part of an autonomous, shared future.

Adapted from a press release by Jaguar Land Rover.

Heads up: Cambridge holographic technology adopted by Jaguar Land Rover

26 November 2015

Image: Head-up display (HUD) projecting key driving information onto a small area of the windscreen. Credit: Jaguar Land Rover

A ‘head-up’ display for passenger vehicles developed at Cambridge – the first to incorporate holographic techniques – has been incorporated into Jaguar Land Rover vehicles.

Cambridge researchers have developed a new type of head-up display for vehicles, the first to use laser holographic techniques to project information such as speed, direction and navigation onto the windscreen so the driver doesn’t have to take their eyes off the road. The technology – conceptualised in the University’s Department of Engineering more than a decade ago – is now available on all Jaguar Land Rover vehicles. According to the researchers behind it, the technology is another step towards cars which provide a fully immersive experience, or which could even improve safety by monitoring driver behaviour.

Cars can now park for us, stop us skidding out of control, or even prevent us from colliding with other cars. Head-up displays (HUDs) are one of many features incorporated into cars in recent years. Alongside the development of more sophisticated in-car technology, various companies around the world, most notably Google, are developing autonomous cars.

“We’re moving towards a fully immersive driver experience in cars, and we think holographic technology could be a big part of that, by providing important information, or even by encouraging good driver behaviour,” said one of the technology’s developers, Professor Daping Chu of the University’s Department of Engineering, who is also Chairman of the Centre for Advanced Photonics and Electronics (CAPE).

CAPE was established in 2004 to enable Cambridge researchers to work in partnership with industry to translate science into new technologies and products. The holographic HUD technology originated with Professor Bill Crossland in 2001, and was licensed to and developed by CAPE partner company Alps Electric, and then by Two Trees Photonics Ltd of Milton Keynes, in collaboration with researchers at CAPE. Products were designed by Two Trees Photonics and Alps, and manufactured by Alps for Jaguar Land Rover. The HUD became an available option on their vehicles in September 2014.
The HUD technology developed at Cambridge is the first to use laser holographic techniques, which provide better colour, brightness and contrast than other systems in a smaller, lighter package. It provides key information to the driver without them having to take their eyes off the road.
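Neither this article nor the product literature spells out the hologram-generation algorithm that was shipped, but the textbook route to driving a phase-modulating laser display is iterative phase retrieval. The sketch below is the classic Gerchberg–Saxton loop, shown purely to illustrate how a phase-only pattern can be made to project an arbitrary image; all sizes and parameters are illustrative.

```python
import numpy as np

def gerchberg_saxton(target_intensity, iterations=50, seed=0):
    """Classic Gerchberg-Saxton phase retrieval: find a phase-only hologram
    whose far-field (Fourier-plane) intensity approximates the target image.
    Illustrative only; the production algorithm in the vehicle HUD is not
    described in the article."""
    rng = np.random.default_rng(seed)
    amp = np.sqrt(target_intensity)                  # desired replay amplitude
    field = np.exp(1j * rng.uniform(0.0, 2 * np.pi, target_intensity.shape))
    for _ in range(iterations):
        replay = np.fft.fft2(field)                  # propagate to replay plane
        replay = amp * np.exp(1j * np.angle(replay)) # impose target amplitude
        field = np.fft.ifft2(replay)                 # propagate back
        field = np.exp(1j * np.angle(field))         # phase-only constraint
    return np.angle(field)                           # drive signal for the SLM

# Toy target: one bright bar (think: a stroke of a projected speed read-out).
target = np.zeros((64, 64))
target[30:34, 16:48] = 1.0
phase = gerchberg_saxton(target)
replay = np.abs(np.fft.fft2(np.exp(1j * phase))) ** 2
print("energy landing in the bar: %.0f%%"
      % (100 * replay[30:34, 16:48].sum() / replay.sum()))
```

Because the image is formed by diffraction rather than by blocking light, most of the laser power can be steered into the bright symbols, which is one reason holographic projection suits compact, high-brightness displays.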
But according to Chu, the technology’s potential has yet to be fully realised, and its real advantage is what it could be used for in future models. “What we really want to see is a fully 3D display which can provide much more information to the driver in a non-intrusive way – this is still a first-generation piece of technology,” he said.

For a technology that feels somewhat futuristic, HUDs actually have a long history. The earliest HUDs were developed during the Second World War to help pilots hit their targets while manoeuvring. The modern HUD became commonplace in military aircraft in the 1960s and in commercial aircraft in the 1970s, and in 1988 the first production car with a HUD was introduced.

In aircraft, a typical HUD includes information such as airspeed, altitude, heading and a horizon line, with additional information such as distance to target and weapon status for military applications.

Most of the HUDs in passenger cars display the same information as can be seen on the dashboard – speedometer and tachometer readings, as well as navigation information. Some models also display night-vision information.

The commercially available Cambridge HUD projects information relevant to the driver onto the windscreen, in full colour and in two dimensions. But according to Chu, this type of technology is just getting started.

“There are three main types of information that we could integrate into future holographic head-up displays,” he said. “The first is the type of information that’s on today’s displays, but potentially we could add other information in a non-intrusive way: for example, if the driver passes a petrol station, perhaps the price of petrol at that station could flash up in the corner – the trick is how to display the most useful information in a way that doesn’t distract the driver.

“The next level of information that could be incorporated into holographic HUDs is information about the position of pedestrians, cyclists, kerbs or other vehicles, or whether the driver is on the right track. And if we move to the next level again, we start thinking about how we can use this sort of technology to help encourage good driving behaviour.”

Although it is the realm of fantasy at the moment, the sorts of things which Chu envisions for future holographic HUDs could help avoid accidents by monitoring driver behaviour. “Imagine if this technology could be used to give alerts to the driver if they were driving too fast, or getting drowsy, or were over the legal alcohol limit. You could have all of this information with an augmented-reality approach – your screen is your world, really. What I want is for the driver to have an immersive experience in how they connect to the world.”

The sort of immersive experience which Chu predicts crosses over with the development of autonomous or driverless cars, another project involving researchers from Cambridge’s Department of Engineering.

“The car will evolve,” said Chu. “I’m sure in 50 years’ time everything in cars will be controlled by computers, but it’s being developed in different directions. The sorts of questions we’re interested in answering are around the idea of integrating critical and non-critical systems in a vehicle. When these systems are integrated, who ultimately makes the decision – the car or the driver? And in the case of disagreement, who wins?”

Lee Skrypchuk, Human Machine Interface Technical Specialist at Jaguar Land Rover, said: “The development of a laser holographic HUD presented a number of technical challenges, but also a number of benefits, including small package size, high optical efficiency, a wide colour gamut and cross-platform compatibility. Incorporating a laser holographic light engine was a true world-first application of the technology for Jaguar Land Rover, and I’m delighted that it has worked so well in our vehicles.”