University of Cambridge - David Hardman /taxonomy/people/david-hardman en Robot trained to read braille at twice the speed of humans /research/news/robot-trained-to-read-braille-at-twice-the-speed-of-humans <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/img-4841-dp.jpg?itok=RoYah_Zz" alt="Robot braille reader" title="Robot braille reader, Credit: Parth Potdar" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The research team, from the University of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. The robot was able to read the braille at 315 words per minute at close to 90% accuracy.</p> <p>Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. The <a href="https://ieeexplore.ieee.org/document/10410896">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>.</p> <p>Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.</p> <p>Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge.
In <a href="https://birlab.org/">Professor Fumiya Iida's lab</a> in Cambridge's Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.</p> <p>"The softness of human fingertips is one of the reasons we're able to grip things with the right amount of pressure," said Parth Potdar from Cambridge's Department of Engineering and an undergraduate at Pembroke College, the paper's first author. "For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it's tricky to have both at once, especially when dealing with flexible or deformable surfaces."</p> <p>Braille is an ideal test for a robot 'fingertip' as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. The researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.</p> <p>"There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read," said co-author David Hardman, also from the Department of Engineering. "Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that's more realistic and far more efficient."</p> <p>The robotic sensor the researchers used has a camera in its 'fingertip', and reads by using a combination of the information from the camera and the sensors. "This is a hard problem for roboticists as there's a lot of image processing that needs to be done to remove motion blur, which is time- and energy-consuming," said Potdar.</p> <p>The team developed machine learning algorithms so the robotic reader would be able to 'deblur' the images before the sensor attempted to recognise the letters.
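The training idea can be sketched in a few lines: pair each sharp braille rendering with a synthetically blurred copy, so a deblurring model can learn to recover the sharp image. This is a minimal illustration, not the paper's implementation; it assumes the blur from sliding can be approximated by a simple horizontal box kernel, and the image size and kernel length are made up for the example.

```python
import numpy as np

def apply_motion_blur(sharp, length=5):
    """Approximate the blur from sliding over the page with a horizontal
    box kernel of the given length (in pixels): each pixel becomes the
    average of its row neighbours, smearing dots along the reading direction."""
    kernel = np.full(length, 1.0 / length)  # uniform averaging kernel
    # Blur each row independently: the motion is along the reading direction.
    return np.array([np.convolve(row, kernel, mode="same") for row in sharp])

# A toy 'sharp' rendering of a single raised braille dot.
sharp = np.zeros((8, 12))
sharp[3:5, 4:6] = 1.0

# Training pair: the model sees the blurred image; the sharp one is the target.
blurred = apply_motion_blur(sharp, length=5)
training_pair = (blurred, sharp)
```

In the pipeline the article describes, a learned model inverts this kind of blur before a separate classifier recognises each character; the box kernel here is only a stand-in for the real motion blur a sliding camera produces.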
They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.</p> <p>Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.</p> <p>"Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille," said Hardman. "We found a nice trade-off between speed and accuracy, which is also the case with human readers."</p> <p>"Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation," said Potdar.</p> <p>In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.</p> <p>&nbsp;</p> <p><em><strong>Reference:</strong><br /> Parth Potdar et al. '<a href="https://ieeexplore.ieee.org/document/10410896">High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions</a>.' IEEE Robotics and Automation Letters (2024).
DOI: 10.1109/LRA.2024.3356978</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-217601" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/can-robots-read-braille">Can robots read braille?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/xqtA2Z668Ic?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Parth Potdar</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot braille reader</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.
Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 29 Jan 2024 06:04:52 +0000 sc604 244161 at Self-healing materials for robotics made from 'jelly' and salt /research/news/self-healing-materials-for-robotics-made-from-jelly-and-salt <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/selfhealingrobotics.jpg?itok=IX6Jk8iI" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The low-cost jelly-like materials, developed by researchers at the University of Cambridge, can sense strain, temperature and humidity. And unlike earlier self-healing robots, they can also partially repair themselves at room temperature.</p>&#13; &#13; <p>The <a href="https://www.nature.com/articles/s41427-022-00357-9">results</a> are reported in the journal <em>NPG Asia Materials</em>.</p>&#13; &#13; <p>Soft sensing technologies could transform robotics, tactile interfaces and wearable devices, among other applications.
However, most soft sensing technologies aren't durable and consume large amounts of energy.</p>&#13; &#13; <p>"Incorporating soft sensors into robotics allows us to get a lot more information from them, like how strain on our muscles allows our brains to get information about the state of our bodies," said David Hardman from Cambridge's Department of Engineering, the paper's first author.</p>&#13; &#13; <p>As part of the EU-funded SHERO project, Hardman and his colleagues have been working to develop soft sensing, self-healing materials for robotic hands and arms. These materials can detect when they are damaged, take the necessary steps to temporarily heal themselves and then resume work – all without the need for human interaction.</p>&#13; &#13; <p>"We've been working with self-healing materials for several years, but now we're looking into faster and cheaper ways to make self-healing robots," said co-author Dr Thomas George-Thuruthel, also from the Department of Engineering.</p>&#13; &#13; <p>Earlier versions of the self-healing robots needed to be heated in order to heal, but the Cambridge researchers are now developing materials that can heal at room temperature, which would make them more useful for real-world applications.</p>&#13; &#13; <p>"We started with a stretchy, gelatine-based material which is cheap, biodegradable and biocompatible and carried out different tests on how to incorporate sensors into the material by adding in lots of conductive components," said Hardman.</p>&#13; &#13; <p>The researchers found that printing sensors containing sodium chloride – salt – instead of carbon ink resulted in a material with the properties they were looking for.
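Because the printed salt-based sensors change resistance roughly linearly with strain, a one-off least-squares calibration is enough to turn a resistance reading back into an estimate of deformation. The sketch below is hypothetical: the resistance values are illustrative numbers, not measured data from the paper.

```python
import numpy as np

# Hypothetical calibration data: applied strain (as a fraction of the
# sensor's original length) vs measured resistance in ohms.
strain = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
resistance = np.array([100.0, 150.0, 200.0, 250.0, 300.0])

# Least-squares fit of R = a * strain + b, relying on the linear response.
a, b = np.polyfit(strain, resistance, 1)

def strain_from_resistance(r):
    """Invert the linear calibration to estimate strain from a reading."""
    return (r - b) / a

# Estimate the strain corresponding to a new resistance reading.
estimate = strain_from_resistance(175.0)
```

A single linear fit like this only works because the response stays linear over the sensor's whole stretch range; a sensor with a nonlinear response would need a lookup table or a higher-order fit instead.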
Since salt is soluble in the water-filled hydrogel, it provides a uniform channel for ionic conduction – the movement of ions.</p>&#13; &#13; <p>When measuring the electrical resistance of the printed materials, the researchers found that changes in strain resulted in a highly linear response, which they could use to calculate the deformations of the material. Adding salt also enabled sensing of stretches of more than three times the sensor's original length, so that the material can be incorporated into flexible and stretchable robotic devices.</p>&#13; &#13; <p>The self-healing materials are cheap and easy to make, either by 3D printing or casting. They are preferable to many existing alternatives since they show long-term strength and stability without drying out, and they are made entirely from widely available, food-safe materials.</p>&#13; &#13; <p>"It's a really good sensor considering how cheap and easy it is to make," said George-Thuruthel. "We could make a whole robot out of gelatine and print the sensors wherever we need them."</p>&#13; &#13; <p>The self-healing hydrogels bond well with a range of different materials, meaning they can easily be incorporated with other types of robotics. For example, much of the research in the <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, where the researchers are based, is focused on the development of artificial hands. Although this material is a proof-of-concept, if developed further, it could be incorporated into artificial skins and custom-made wearable and biodegradable sensors.</p>&#13; &#13; <p>This work was supported by the Self-HEaling soft RObotics (SHERO) project, funded under the Future and Emerging Technologies (FET) programme of the European Commission.</p>&#13; &#13; <p>&nbsp;</p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; David Hardman, Thomas George-Thuruthel, and Fumiya Iida.
'<a href="https://www.nature.com/articles/s41427-022-00357-9">Self-Healing Ionic Gelatin/Glycerol Hydrogels for Strain Sensing Applications</a>.' NPG Asia Materials (2022). DOI: 10.1038/s41427-022-00357-9</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed self-healing, biodegradable, 3D-printed materials that could be used in the development of realistic artificial hands and other soft robotics applications.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It's a really good sensor considering how cheap and easy it is to make</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Thomas George-Thuruthel</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-192031" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/self-healing-robot-developed-by-cambridge-uni-engineers">Self healing robot developed by Cambridge Uni engineers</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-2 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/eVH0YCeI464?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" 
src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 18 Feb 2022 16:54:17 +0000 sc604 229951 at