University of Cambridge - Joan Lasenby /taxonomy/people/joan-lasenby en Identification of ‘violent’ processes that cause wheezing could lead to better diagnosis and treatment for lung disease /research/news/identification-of-violent-processes-that-cause-wheezing-could-lead-to-better-diagnosis-and-treatment <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/embroideredlungs.jpg?itok=5cgJvd5t" alt="Dimensional Lungs" title="Dimensional Lungs, Credit: Hey Paul Studios" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, used modelling and high-speed video techniques to show what causes wheezing and how to predict it. Their results could be used as the basis of a cheaper and faster diagnostic for lung disease that requires just a stethoscope and a microphone.</p> <p>Improved understanding of the physical mechanism responsible for generating wheezing sounds could provide a better causal link between symptoms and disease, and help improve diagnosis and treatment. The results are reported in the journal <em>Royal Society Open Science</em>.</p> <p>At some point, most of us have experienced wheezing, a high-pitched whistling sound made while breathing. For most people, the phenomenon is temporary and usually the result of a cold or mild allergic reaction. However, regular or chronic wheezing is often a symptom of more serious conditions, such as asthma, emphysema, chronic obstructive pulmonary disease (COPD) or certain cancers.</p> <p>“Because wheezing makes it harder to breathe, it puts an enormous amount of pressure on the lungs,” said first author Dr Alastair Gregory from Cambridge’s Department of Engineering. 
“The sounds associated with wheezing have been used to make diagnoses for centuries, but the physical mechanisms responsible for the onset of wheezing are poorly understood, and there is no model for predicting when wheezing will occur.”</p> <p>Co-author Dr Anurag Agarwal, Head of the Acoustics lab in the Department of Engineering, said he first got the idea to study wheezing after a family vacation several years ago. “I started wheezing the first night we were there, which had never happened to me before,” he said. “And as an engineer who studies acoustics, my first thought was how cool it was that my body was making these noises. After a few days however, I was having real trouble breathing, which made the novelty wear off pretty quickly.”</p> <p>Agarwal’s wheezing was likely caused by a dust mite allergy, which was easily treated with over-the-counter antihistamines. However, after speaking with a neighbour who is also a specialist in respiratory medicine, he learned that even though it is a common occurrence, the physical mechanisms that cause wheezing are somewhat mysterious.</p> <p>“Since wheezing is associated with so many conditions, it is difficult to be sure of what is wrong with a patient just based on the wheeze, so we’re working on understanding how wheezing sounds are produced so that diagnoses can be more specific,” said Agarwal.</p> <p>The airways of the lung are a branching network of flexible tubes, called bronchioles, that gradually get shorter and narrower as they get deeper into the lung.</p> <p>In order to mimic this setup in the lab, the researchers modified a piece of equipment called a Starling resistor, in which airflow is driven through thin elastic tubes of various lengths and thicknesses.</p> <p>Co-author and computer vision specialist Professor Joan Lasenby developed a multi-camera stereoscopy technique to film the air being forced through the tubes at different degrees of tension, in order to observe the physical mechanisms that cause 
wheezing.</p> <p>“It surprised us just how violent the mechanism of wheezing is,” said Gregory, who is also a Junior Research Fellow at Magdalene College. “We found that there are two conditions for wheezing to occur: the first is that the pressure on the tubes is such that one or more of the bronchioles nearly collapses, and the second is that air is forced through the collapsed airway with enough force to drive oscillations.”</p> <p>Once these conditions are met, the oscillations grow and are sustained by a flutter mechanism in which waves travelling from front to back have the same frequency as the opening and closing of the tube. “A similar phenomenon has been seen in aircraft wings when they fail, or in bridges when they collapse,” said Agarwal. “When up and down vibrations are at the same frequency as clockwise and anticlockwise twisting vibrations, we get flutter that causes the structure to collapse. The same process is at work inside the respiratory system.”</p> <p>Using these observations, the researchers developed a ‘tube law’ in order to predict when this potentially damaging oscillation might occur, depending on the tube’s material properties, geometry and the amount of tension.</p> <p>“We then use this law to build a model that can predict the onset of wheezing and could even be the basis of a cheaper and faster diagnostic for lung disease,” said Gregory. “Instead of expensive and time-consuming methods such as x-rays or MRI, we wouldn’t need anything more than a microphone and a stethoscope.”</p> <p>A diagnostic based on this method would work by using a microphone – early tests were done using the in-built microphone on a normal smartphone – to record the frequency of the wheezing sound and use this to identify which bronchiole is near collapse, and whether the airways are unusually stiff or flexible in order to target treatment. 
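</p> <p>The first step of such a diagnostic, picking the dominant frequency out of a microphone recording, can be sketched in a few lines of signal processing. The sketch below is illustrative only: the "recording" is a synthesised 400 Hz tone plus noise rather than real audio, and the further step from frequency to a specific bronchiole relies on the researchers' tube-law model, which is not reproduced here.</p>

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component (Hz) of a mono signal."""
    spectrum = np.abs(np.fft.rfft(signal))            # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]                 # frequency of the peak bin

# Stand-in for a smartphone recording: a 400 Hz "wheeze" plus broadband noise.
np.random.seed(0)
rate = 8000                                           # samples per second
t = np.arange(0, 1.0, 1.0 / rate)
recording = np.sin(2 * np.pi * 400 * t) + 0.3 * np.random.randn(len(t))

print(dominant_frequency(recording, rate))            # ≈ 400.0
```

<p>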
The researchers hope that by finding changes in material properties from wheezing, and locations that wheezes come from, the additional information will make it easier to distinguish between different conditions, although further work in this area is still needed.</p> <p> </p> <p><strong><em>Reference:</em></strong><br /> <em>A. L. Gregory, A. Agarwal and J. Lasenby. ‘An Experimental Investigation to Model Wheezing in Lungs.’ Royal Society Open Science (2021). DOI: 10.1098/rsos.201951</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A team of engineers has identified the ‘violent’ physical processes at work inside the lungs which cause wheezing, a condition that affects up to a quarter of the world’s population.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Since wheezing is associated with so many conditions, it is difficult to be sure of what is wrong with a patient just based on the wheeze</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Anurag Agarwal</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://flickr.com/photos/hey__paul/14514805379/in/photolist-o7Cb5d-op6otW-o7Dhd2-o7CaXa-op7Tqp-bRJTbT-crNTLA-bCQ9N7-ekCBFd-akizxw-akizqU-akizv3-akizyN-crNTuG-bRJT6g-bRJSXp-bCQ9Dh-ak9AFq-ak6Q1x-ak9AR9-ak6PZe-ekCB1b-dmyhHU-ekwRUX-ekwRXc-ekwRKT-crNU41-dH2Zku-dNthte-dNySCq-dNthvR-dNySoY-dNyStq-dW4uru-dNExWu-dNExV1-dNExSN-dNExBj-dNExPC-dW4t2b-dW4sCm-dW4sfb-dVXTTX-dCSLqj-ec8YTo-dGWyAt-dH2ZfY-fzw4Zb-dGWyGV-dGWyDK" target="_blank">Hey Paul Studios</a></div></div></div><div class="field 
field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Dimensional Lungs</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Wed, 24 Feb 2021 00:01:31 +0000 sc604 222351 at Sports calibrated /research/features/sports-calibrated <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" 
src="/sites/default/files/styles/content-580x288/public/news/research/features/150206britishcycling.jpg?itok=i0vvoblv" alt="The view from the top of the stands of Lee Valley VeloPark, London." title="The view from the top of the stands of Lee Valley VeloPark, London., Credit: British Cycling" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The bat makes contact with the ball; the ball flies back, back, back; and a thousand mobile phones capture it live as the ball soars over the fence and into the cheering crowd. Baseball is America’s pastime and, as for many other spectator sports, mobile phones have had a huge effect on the experience of spending an afternoon at the ballpark.</p> <p>But what to do with that video of a monster home run or a spectacular diving catch once the game is over? What did that same moment look like from the other end of the stadium? How many other people filmed exactly the same thing but from different vantage points? Could something useful be saved from what would otherwise be simply a sporting memory?</p> <p>Dr Joan Lasenby of the University of Cambridge’s Department of Engineering has been working on ways of gathering quantitative information from video, and thanks to an ongoing partnership with Google, a new method of digitally ‘reconstructing’ shared experiences such as sport or concerts is being explored at YouTube.</p> <p>The goal is for users to upload their videos in collaboration with the event coordinator, and a ‘cloud’-based system will identify where in the space the video was taken from, creating a ‘map’ of different cameras from all over the stadium. 
The user can then choose which camera they want to watch, allowing them to experience the same event from dozens or even hundreds of different angles.<img alt="" src="/sites/www.cam.ac.uk/files/inner-images/150206-british-cycling-3.gif" style="width: 250px; height: 250px; float: right;" /></p> <p>But although stitching together still images is reasonably straightforward, doing the same thing with video, especially when the distance between cameras can be on a scale as massive as a sports stadium, is much more difficult. “There’s a lot of information attached to the still images we take on our phones or cameras, such as the type of camera, the resolution, the focus, and so on,” said Lasenby. “But the videos we upload from our phones have none of that information attached, so patching them together is much more difficult.”</p> <p>Using a series of videos taken on mobile phones during a baseball game, the researchers developed a method of using visual information contained in the videos, such as a specific advertisement or other distinctive static features of the stadium, as a sort of ‘anchor’ which enables the video’s location to be pinpointed.</p> <p>“Another problem we had to look at was a way to separate the good frames from the bad,” said Dr Stuart Bennett, a postdoctoral researcher in Lasenby’s group who developed this new method of three-dimensional reconstruction while a PhD student. “With the videos you take on your phone, usually you’re not paying attention to the quality of each frame as you would with a still image. We had to develop a way of efficiently, and automatically, choosing the best frames and deleting the rest.”</p> <p>To identify where each frame originated from in the space, the technology selects the best frames automatically via measures of sharpness and edge or corner content and then selects those which match. The system works with as few as two cameras, and the team has tested it with as many as ten. 
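</p> <p>One common way to score frames "via measures of sharpness", as described above, is the variance of the Laplacian: in-focus frames with strong edges produce large second derivatives, while blurred frames do not. The sketch below is an assumption, not the team's published method; the frames here are synthetic arrays standing in for video, and the threshold-free "keep the N sharpest" policy is one illustrative choice among many.</p>

```python
import numpy as np

def sharpness(frame):
    """Variance of the Laplacian: a common proxy for focus/edge content."""
    f = frame.astype(float)
    # Discrete Laplacian of the interior pixels via array slicing.
    lap = (f[:-2, 1:-1] + f[2:, 1:-1] + f[1:-1, :-2] + f[1:-1, 2:]
           - 4.0 * f[1:-1, 1:-1])
    return lap.var()

def select_best_frames(frames, keep=2):
    """Keep the `keep` sharpest frames from a clip, discarding the rest."""
    return sorted(frames, key=sharpness, reverse=True)[:keep]

# Synthetic stand-ins: a high-contrast "sharp" frame and a featureless "blurred" one.
rng = np.random.default_rng(0)
sharp_frame = rng.integers(0, 255, (64, 64))
blurred_frame = np.full((64, 64), 128)

best = select_best_frames([blurred_frame, sharp_frame], keep=1)
```

<p>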
YouTube has been stress testing it further, expecting that the technology has the potential to improve fan engagement in the sports and music entertainment sectors.</p> <p>Although the technology is primarily intended for use in an entertainment context, Lasenby points out it could potentially be applied for surveillance purposes as well. “It is a possible application down the road, and could one day be used by law enforcement to help provide information at the crime scene,” said Lasenby. “At the moment, a lot of surveillance is done with fixed cameras, and you know everything about the camera. But this sort of technology might be able to give you information about what’s going on in a particular video shot on a phone by making locations in that video identifiable.”</p> <div> <br /> <p>Another area where Lasenby’s group is extracting quantitative data from video is in their partnership with British Cycling. Over the past decade, the UK has become a dominant force in international cycling, thanks to the quality of its riders and equipment, its partnerships with industry and academia, and its use of technology to help improve speeds on the track and on the road.</p> <img alt="" src="/sites/www.cam.ac.uk/files/inner-images/150206-british-cycling-4.gif" style="width: 250px; height: 250px; float: right;" /> <p>“In sport, taking qualitative videos and photographs is commonplace, which is extremely useful, as athletes aren’t robots,” said Professor Tony Purnell, Head of Technical Development for the Great Britain Cycling Team and Royal Academy of Engineering Visiting Professor at Cambridge. “But what we wanted was to start using image processing not just to gather qualitative information, but to get some good quantitative data as well.”</p> <p>Currently, elite cyclists are filmed on a turbo trainer, which is essentially a stationary bicycle in a lab or in a wind tunnel. The resulting videos are then assessed to improve aerodynamics or help prevent injuries. 
“But for cyclists, especially sprinters, sitting on a constrained machine just isn’t realistic,” said Lasenby. “When you look at a sprinter on a track, they’re throwing their bikes all over the place to get even the tiniest advantage. So we thought that if we could get quantitative data from video of them actually competing, it would be much more valuable than anything we got from a stationary turbo trainer.”</p> <p>To obtain this sort of data, the researchers utilised the same techniques as are used in the gaming industry, where markers are used to obtain quantitative information about what’s happening – similar to the team’s work with Google.</p> <p>One thing that simplifies the gathering of quantitative information from these videos is the ability to ‘subtract’ the background, so that only the athlete remains. But doing this is no easy task, especially as the boards of the velodrome and the legs of the cyclist are close to the same colour. Additionally, things that might appear minor to the human eye, such as shadows or changes in the light, make the maths of doing this type of subtraction extremely complicated. Working with undergraduate students, graduate students and postdoctoral researchers, however, Lasenby’s team has managed to develop real-time subtraction methods to extract the data that may give the British team the edge as they prepare for the Rio Olympics in 2016.</p> <p>“Technology is massively important in sport,” said Lasenby. 
“The techniques we’re developing here are helping to advance how we experience sport, both as athletes and as fans.”</p> <p><em>Inset images: credit British Cycling</em></p> </div> <p> </p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>New methods of gathering quantitative data from video – whether shot on a mobile phone or an ultra-high definition camera – may change the way that sport is experienced, for athletes and fans alike.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">The techniques we’re developing here are helping to advance how we experience sport, both as athletes and as fans</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Dr Joan Lasenby</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/britishcycling/15759116409/" target="_blank">British Cycling</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">The view from the top of the stands of Lee Valley VeloPark, London.</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p>The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by-nc-sa/3.0/">Creative Commons Licence</a>. If you use this content on your site please link back to this page. 
For image rights, please see the credits associated with each individual image.</p> <p><a href="http://creativecommons.org/licenses/by-nc-sa/3.0/"><img alt="" src="/sites/www.cam.ac.uk/files/80x15.png" style="width: 80px; height: 15px;" /></a></p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 06 Feb 2015 15:08:06 +0000 sc604 145001 at A new model for industrial–academic partnership /research/news/new-model-for-industrial-academic-partnership <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/discovery-fund-credit-cambridge-enterprise.jpg?itok=w-M2JjjE" alt="Discovery Fund" title="Discovery Fund, Credit: Cambridge Enterprise" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><div> <p>PneumaCare is solving the problem of how to monitor lung function in babies, children and chronically sick patients using a non-invasive medical device. The idea for the device, which combines innovative image processing with technologies from the gaming and movie industry, has been developed by a consortium of experts that includes Dr Joan Lasenby at the Department of Engineering, Dr Richard Iles at Cambridge University Hospitals NHS Foundation Trust and Dr Colin Smithers of Plextek Ltd.</p> <p>The company represents a new and interesting departure from the usual spin-out model, as Dr Gareth Roberts, PneumaCare Chief Executive, explained: ‘Recognising an unmet medical need, the company consulted and utilised University expertise to create an innovative product. 
We have developed a close working relationship with the academics involved and, to cement this relationship, the academic partners have become equity holders. The success of this model ensures that the University shares in the company’s success.’</p> <p>PneumaCare will present data from its first product, PneumaScan™, over the next few months. ‘We believe that the PneumaScan will make monitoring feasible, effective and simpler, leading to better patient recovery,’ said Dr Roberts. ‘We have generated considerable interest in the investment community and are poised to go into full clinical development and medical trials.’</p> <p>Part of this investment has come from the newly created University of Cambridge Discovery Fund, which is managed by Cambridge Enterprise Ltd. The fund was created to smooth the path of transferring University-related technologies for the benefit of society by providing proof of concept, pre-licence, pre-seed and seed investments, and is capitalised from donations through the University of Cambridge 800th Anniversary campaign.</p> </div> <p>For more information about the University of Cambridge Discovery Fund, please contact Cambridge Enterprise Ltd (Tel: +44 (0)1223 760339; email: <a href="mailto:enquires@enterprise.cam.ac.uk">enquires@enterprise.cam.ac.uk</a>) or visit <a href="https://www.enterprise.cam.ac.uk/">www.enterprise.cam.ac.uk/</a></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>PneumaCare, the first company to receive funding from the University of Cambridge Discovery Fund, is a new model for utilising academic expertise.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Cambridge Enterprise</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div 
class="field-items"><div class="field-item even">Discovery Fund</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by-nc-sa/3.0/"><img alt="" src="/sites/www.cam.ac.uk/files/80x15.png" style="width: 80px; height: 15px;" /></a></p> <p>This work is licensed under a <a href="http://creativecommons.org/licenses/by-nc-sa/3.0/">Creative Commons Licence</a>. If you use this content on your site please link back to this page.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 01 Jan 2010 15:54:49 +0000 lw355 25952 at