ֱ̽ of Cambridge - Fumiya Iida /taxonomy/people/fumiya-iida en ‘Palaeo-robots’ to help scientists understand how fish started to walk on land /research/news/palaeo-robots-to-help-scientists-understand-how-fish-started-to-walk-on-land <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/paleo-robots-883x432.jpg?itok=rSGMB0cY" alt="Illustration of palaeo-robots." title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://doi.org/10.1126/scirobotics.adn1125">Writing</a> in the journal <em>Science Robotics</em>, the research team, led by the ֱ̽ of Cambridge, outline how ‘palaeo-inspired robotics’ could provide a valuable experimental approach to studying how the pectoral and pelvic fins of ancient fish evolved to support weight on land.</p> <p>“Since fossil evidence is limited, we have an incomplete picture of how ancient life made the transition to land,” said lead author <a href="https://www.michaelishida.com/">Dr Michael Ishida</a> from Cambridge’s Department of Engineering. “Palaeontologists examine ancient fossils for clues about the structure of hip and pelvic joints, but there are limits to what we can learn from fossils alone. That’s where robots can come in, helping us fill gaps in the research, particularly when studying major shifts in how vertebrates moved.”</p> <p>Ishida is a member of Cambridge’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, led by Professor Fumiya Iida. ֱ̽team is developing energy-efficient robots for a variety of applications, which take their inspiration from the efficient ways that animals and humans move.</p> <p>With funding from the Human Frontier Science Program, the team is developing palaeo-inspired robots, in part by taking their inspiration from modern-day ‘walking fish’ such as mudskippers, and from fossils of extinct fish. “In the lab, we can’t make a living fish walk differently, and we certainly can’t get a fossil to move, so we’re using robots to simulate their anatomy and behaviour,” said Ishida.</p> <p> ֱ̽team is creating robotic analogues of ancient fish skeletons, complete with mechanical joints that mimic muscles and ligaments. Once complete, the team will perform experiments on these robots to determine how these ancient creatures might have moved.</p> <p>“We want to know things like how much energy different walking patterns would have required, or which movements were most efficient,” said Ishida. “This data can help confirm or challenge existing theories about how these early animals evolved.”</p> <p>One of the biggest challenges in this field is the lack of comprehensive fossil records. Many of the ancient species from this period in Earth’s history are known only from partial skeletons, making it difficult to reconstruct their full range of movement.</p> <p>“In some cases, we’re just guessing how certain bones connected or functioned,” said Ishida. “That’s why robots are so useful—they help us confirm these guesses and provide new evidence to support or rebut them.”</p> <p>While robots are commonly used to study movement in living animals, very few research groups are using them to study extinct species. “There are only a few groups doing this kind of work,” said Ishida. 
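</p>
<p>For readers curious what the energy comparisons Ishida describes might look like in practice, the short Python sketch below computes a standard efficiency measure, the cost of transport, for a few candidate gaits. It is purely illustrative: the gait names, mass and energy figures are invented, and it is not the team’s code.</p>
<pre><code>
# Illustrative only: comparing hypothetical gait trials by cost of transport,
# a standard efficiency metric (energy used per unit weight per unit distance).
# The trial values below are made up for demonstration.

G = 9.81  # gravitational acceleration, m/s^2

def cost_of_transport(energy_j, mass_kg, distance_m):
    """Dimensionless cost of transport: lower means a more efficient gait."""
    return energy_j / (mass_kg * G * distance_m)

trials = {
    "fin-dragging crawl": {"energy_j": 42.0, "mass_kg": 1.5, "distance_m": 2.0},
    "alternating fin walk": {"energy_j": 27.0, "mass_kg": 1.5, "distance_m": 2.0},
    "synchronous hop": {"energy_j": 35.0, "mass_kg": 1.5, "distance_m": 2.0},
}

for name, t in trials.items():
    cot = cost_of_transport(t["energy_j"], t["mass_kg"], t["distance_m"])
    print(f"{name}: cost of transport = {cot:.2f}")

# The gait with the lowest cost of transport would be the most efficient
# candidate, which is the kind of comparison the researchers describe.
</code></pre>
<p>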
“But we think it’s a natural fit – robots can provide insights into ancient animals that we simply can’t get from fossils or modern species alone.”</p> <p>The team hopes that their work will encourage other researchers to explore the potential of robotics to study the biomechanics of long-extinct animals. “We’re trying to close the loop between fossil evidence and real-world mechanics,” said Ishida. “Computer models are obviously incredibly important in this area of research, but since robots are interacting with the real world, they can help us test theories about how these creatures moved, and maybe even why they moved the way they did.”</p> <p>The team is currently in the early stages of building their palaeo-robots, but they hope to have some results within the next year. The researchers say they hope their robot models will not only deepen understanding of evolutionary biology, but could also open up new avenues of collaboration between engineers and researchers in other fields.</p> <p>The research was supported by the Human Frontier Science Program. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge. Michael Ishida is a Postdoctoral Research Associate at Gonville and Caius College, Cambridge.</p> <p><em><strong>Reference:</strong><br /> Michael Ishida et al. ‘<a href="https://doi.org/10.1126/scirobotics.adn1125">Paleo-inspired robotics as an experimental approach to the history of life</a>.’ Science Robotics (2024). DOI: 10.1126/scirobotics.adn1125</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The transition from water to land is one of the most significant events in the history of life on Earth. Now, a team of roboticists, palaeontologists and biologists is using robots to study how the ancestors of modern land animals transitioned from swimming to walking, about 390 million years ago.</p> </p></div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. 
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 23 Oct 2024 18:00:00 +0000 sc604 248514 at Robot trained to read braille at twice the speed of humans /research/news/robot-trained-to-read-braille-at-twice-the-speed-of-humans <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/img-4841-dp.jpg?itok=RoYah_Zz" alt="Robot braille reader" title="Robot braille reader, Credit: Parth Potdar" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽research team, from the ֱ̽ of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. ֱ̽robot was able to read the braille at 315 words per minute at close to 90% accuracy.</p> <p>Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. ֱ̽<a href="https://ieeexplore.ieee.org/document/10410896">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>.</p> <p>Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.</p> <p>Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In <a href="https://birlab.org/">Professor Fumiya Iida’s lab</a> in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.</p> <p>“ ֱ̽softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”</p> <p>Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. ֱ̽researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.</p> <p>“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering. 
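</p>
<p>As the following paragraphs explain, the reader works by deblurring each camera frame before a vision model classifies the characters, with the deblurring model trained on sharp braille images to which synthetic blur has been added. The Python sketch below shows how such (blurred, sharp) training pairs could be generated in a generic way; it is an illustration of the general idea rather than the authors’ pipeline, and the blur model and kernel sizes are assumptions.</p>
<pre><code>
# Generic illustration: creating (blurred, sharp) training pairs by applying
# synthetic horizontal motion blur to sharp braille images, the kind of data a
# deblurring model could be trained on. Not the authors' actual code.
import numpy as np

def motion_blur(image, kernel_size=9):
    """Simulate horizontal motion blur by averaging each row over a sliding window."""
    kernel = np.ones(kernel_size) / kernel_size
    return np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"),
                               axis=1, arr=image)

def make_training_pairs(sharp_images, kernel_sizes=(5, 9, 13)):
    """Yield (blurred, sharp) pairs with a range of blur strengths."""
    rng = np.random.default_rng(0)
    for sharp in sharp_images:
        k = int(rng.choice(kernel_sizes))
        yield motion_blur(sharp, kernel_size=k), sharp

# Example with random stand-in 'images' (real use would load photos of braille).
sharp_images = [np.random.default_rng(i).random((64, 64)) for i in range(4)]
pairs = list(make_training_pairs(sharp_images))
print(len(pairs), "training pairs, each image shaped", pairs[0][0].shape)
</code></pre>
<p>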
“Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”</p> <p>The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.</p> <p>The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.</p> <p>Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.</p> <p>“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”</p> <p>“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.</p> <p>In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.</p> <p> </p> <p><em><strong>Reference:</strong><br /> Parth Potdar et al. ‘<a href="https://ieeexplore.ieee.org/document/10410896">High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions</a>.’ IEEE Robotics and Automation Letters (2024). 
DOI: 10.1109/LRA.2024.3356978</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-217601" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/can-robots-read-braille">Can robots read braille?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/xqtA2Z668Ic?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Parth Potdar</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot braille reader</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> ֱ̽text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 29 Jan 2024 06:04:52 +0000 sc604 244161 at Robot ‘chef’ learns to recreate recipes from watching food videos /research/news/robot-chef-learns-to-recreate-recipes-from-watching-food-videos <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/untitled-3_1.jpg?itok=RV53FI1P" alt="Robot arm reaching for a piece of broccoli" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽researchers, from the ֱ̽ of Cambridge, programmed their robotic chef with a ‘cookbook’ of eight simple salad recipes. 
After watching a video of a human demonstrating one of the recipes, the robot was able to identify which recipe was being prepared and make it.</p>&#13; &#13; <p>In addition, the videos helped the robot incrementally add to its cookbook. At the end of the experiment, the robot came up with a ninth recipe on its own. Their <a href="https://ieeexplore.ieee.org/document/10124218">results</a>, reported in the journal <em>IEEE Access</em>, demonstrate how video content can be a valuable and rich source of data for automated food production, and could enable easier and cheaper deployment of robot chefs.</p>&#13; &#13; <p>Robotic chefs have been featured in science fiction for decades, but in reality, cooking is a challenging problem for a robot. Several commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.</p>&#13; &#13; <p>Human cooks can learn new recipes through observation, whether that’s watching another person cook or watching a video on YouTube, but programming a robot to make a range of dishes is costly and time-consuming.</p>&#13; &#13; <p>“We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author.</p>&#13; &#13; <p>Sochacki, a PhD candidate in Professor Fumiya Iida’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, and his colleagues devised eight simple salad recipes and filmed themselves making them. They then used a publicly available neural network to train their robot chef. The neural network had already been programmed to identify a range of different objects, including the fruits and vegetables used in the eight salad recipes (broccoli, carrot, apple, banana and orange).</p>&#13; &#13; <p>Using computer vision techniques, the robot analysed each frame of video and was able to identify the different objects and features, such as a knife and the ingredients, as well as the human demonstrator’s arms, hands and face. Both the recipes and the videos were converted to vectors and the robot performed mathematical operations on the vectors to determine the similarity between a demonstration and a recipe.</p>&#13; &#13; <p>By correctly identifying the ingredients and the actions of the human chef, the robot could determine which of the recipes was being prepared. The robot could infer that if the human demonstrator was holding a knife in one hand and a carrot in the other, the carrot would then get chopped up.</p>&#13; &#13; <p>Of the 16 videos it watched, the robot recognised the correct recipe 93% of the time, even though it only detected 83% of the human chef’s actions. The robot was also able to detect that slight variations in a recipe, such as making a double portion or normal human error, were variations and not a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its cookbook and made it.</p>&#13; &#13; <p>“It’s amazing how much nuance the robot was able to detect,” said Sochacki. 
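</p>
<p>The vector comparison described above can be pictured with a toy example: each recipe and each observed demonstration is summarised as a vector of ingredient counts, and the demonstration is matched to the recipe with the highest cosine similarity. This is a simplified stand-in for the method in the paper, and the ingredient counts and threshold below are invented.</p>
<pre><code>
# Toy illustration of matching a demonstration to a known recipe by comparing
# ingredient-count vectors with cosine similarity. Counts are invented and this
# is a simplification of the paper's approach.
import numpy as np

INGREDIENTS = ["broccoli", "carrot", "apple", "banana", "orange"]

cookbook = {
    "recipe_1": np.array([1, 2, 0, 0, 0]),   # e.g. 1 broccoli, 2 carrots
    "recipe_2": np.array([0, 0, 2, 1, 0]),
    "recipe_3": np.array([0, 1, 0, 1, 1]),
}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_demonstration(observed_counts, cookbook, new_recipe_threshold=0.9):
    """Return the best-matching recipe, or flag a new one if nothing is close enough."""
    scores = {name: cosine_similarity(observed_counts, vec)
              for name, vec in cookbook.items()}
    best = max(scores, key=scores.get)
    if scores[best] >= new_recipe_threshold:
        return best, scores[best]
    return "new recipe", scores[best]

# A double portion of recipe_1 (2 broccoli, 4 carrots) still matches recipe_1,
# because cosine similarity ignores overall scale.
print(match_demonstration(np.array([2, 4, 0, 0, 0]), cookbook))
</code></pre>
<p>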
“These recipes aren’t complex – they’re essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots.”  </p>&#13; &#13; <p> ֱ̽videos used to train the robot chef are not like the food videos made by some social media influencers, which are full of fast cuts and visual effects, and quickly move back and forth between the person preparing the food and the dish they’re preparing. For example, the robot would struggle to identify a carrot if the human demonstrator had their hand wrapped around it – for the robot to identify the carrot, the human demonstrator had to hold up the carrot so that the robot could see the whole vegetable.</p>&#13; &#13; <p>“Our robot isn’t interested in the sorts of food videos that go viral on social media – they’re simply too hard to follow,” said Sochacki. “But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes.”</p>&#13; &#13; <p> ֱ̽research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Grzegorz Sochacki et al. ‘<a href="https://ieeexplore.ieee.org/document/10124218">Recognition of Human Chef’s Intentions for Incremental Learning of Cookbook by Robotic Salad Chef</a>.’ IEEE Access (2023). DOI: 10.1109/ACCESS.2023.3276234</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have trained a robotic ‘chef’ to watch and learn from cooking videos, and recreate the dish itself.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Greg Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-208991" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/robot-chef-learns-to-recreate-recipes-from-watching-food-videos">Robot ‘chef’ learns to recreate recipes from watching food videos</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-2 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nx3k4XA3x4Q?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." 
src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; ֱ̽text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/social-media/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 05 Jun 2023 01:00:00 +0000 sc604 239811 at It’s all in the wrist: energy-efficient robot hand learns how not to drop the ball /stories/robotic-hand <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have designed a low-cost, energy-efficient robotic hand that can grasp a range of objects – and not drop them – using just the movement of its wrist and the feeling in its ‘skin’.  </p> </p></div></div></div> Wed, 12 Apr 2023 03:23:34 +0000 sc604 238441 at Taste of the future: robot chef learns to ‘taste as you go’ /research/news/taste-of-the-future-robot-chef-learns-to-taste-as-you-go <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/chef.jpg?itok=zwU4FEoU" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Working in collaboration with domestic appliances manufacturer Beko, researchers from the ֱ̽ of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.</p> <p>Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.</p> <p>When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato’s flavour will change.</p> <p> ֱ̽robot chef, which has already been trained to make omelettes based on human taster’s feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.</p> <p> ֱ̽researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish over other electronic tasting technologies, which only test a single homogenised sample. 
ֱ̽<a href="https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract">results</a> are reported in the journal <em>Frontiers in Robotics &amp; AI</em>.</p> <p> ֱ̽perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors mostly on the tongue; and the signals from taste receptors are passed to the brain. Once our brains are aware of the flavour, we decide whether we enjoy the food or not.</p> <p>Taste is also highly individual: some people love spicy food, while others have a sweet tooth. A good cook, whether amateur or professional, relies on their sense of taste, and can balance the various flavours within a dish to make a well-rounded final product.</p> <p>“Most home cooks will be familiar with the concept of tasting as you go – checking a dish throughout the cooking process to check whether the balance of flavours is right,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking.”</p> <p>“When we taste, the process of chewing also provides continuous feedback to our brains,” said co-author Dr Arsen Abdulali, also from the Department of Engineering. “Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product.”</p> <p> ֱ̽researchers are members of Cambridge’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a> run by <a href="http://mi.eng.cam.ac.uk/Main/FI224">Professor Fumiya Iida</a> of the Department of Engineering, which focuses on training robots to carry out the so-called last metre problems which humans find easy, but robots find difficult. Cooking is one of these tasks: earlier tests with their robot ‘chef’ have produced a passable omelette using feedback from human tasters.</p> <p>“We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking,” said Sochacki.</p> <p>To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.</p> <p>Using the probe, the robot ‘tasted’ the dishes in a grid-like fashion, returning a reading in just a few seconds.</p> <p>To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. 
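</p>
<p>A toy example can show why readings taken before and after ‘chewing’ carry more information than a single measurement: summary statistics of the taste map at each stage are stacked into one feature vector per dish, so two dishes with the same average saltiness but a different salt distribution end up clearly separated. All numbers below are invented, and this is not the classifier used in the paper.</p>
<pre><code>
# Toy illustration of combining taste-map readings from different 'chewing'
# stages into one feature vector per dish. All values are invented.
import numpy as np

def dish_features(maps_by_stage):
    """Concatenate summary statistics of the taste map at each mastication stage."""
    feats = []
    for taste_map in maps_by_stage:
        feats.extend([taste_map.mean(), taste_map.std()])
    return np.array(feats)

# Two hypothetical dishes: same overall salt, different distribution before blending.
rng = np.random.default_rng(42)
dish_a = [rng.normal(2.5, 0.8, (5, 5)), rng.normal(2.5, 0.1, (5, 5))]  # unmixed, then blended
dish_b = [rng.normal(2.5, 0.1, (5, 5)), rng.normal(2.5, 0.1, (5, 5))]  # already uniform

fa, fb = dish_features(dish_a), dish_features(dish_b)
print("dish A features:", np.round(fa, 2))
print("dish B features:", np.round(fb, 2))
print("distance between dishes:", round(float(np.linalg.norm(fa - fb)), 2))
</code></pre>
<p>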
ֱ̽different readings at different points of ‘chewing’ produced taste maps of each dish.</p> <p>Their results showed a significant improvement in the ability of robots to assess saltiness over other electronic tasting methods, which are often time-consuming and only provide a single reading.</p> <p>While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy and could be tweaked according to individual tastes.</p> <p>“When a robot is learning how to cook, like any other cook, it needs indications of how well it did,” said Abdulali. “We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can ‘see’ the difference in the food as it’s chewed, which improves its ability to taste.”</p> <p>“Beko has a vision to bring robots to the home environment which are safe and easy to use,” said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. “We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users.”</p> <p>In future, the researchers are looking to improve the robot chef so it can taste different types of food and improve sensing capabilities so it can taste sweet or oily food, for example.</p> <p> ֱ̽research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC) Centre of Doctoral Training on Agri-Food Robotics (Agriforwards CDT). EPSRC is part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.</p> <p> </p> <p><em><strong>Reference:</strong><br /> Grzegorz Sochacki, Arsen Abdulali, and Fumiya Iida. ‘<a href="https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract">Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking</a>.’ Frontiers in Robotics &amp; AI (2022). 
DOI: 10.3389/frobt.2022.886074</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Grzegorz Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-194681" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/taste-of-the-future-robot-chef-learns-to-taste-as-you-go">Taste of the future: robot chef learns to ‘taste as you go’</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-3 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nZ0xviqzUJg?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> ֱ̽text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 04 May 2022 04:00:00 +0000 sc604 231861 at Self-healing materials for robotics made from ‘jelly’ and salt /research/news/self-healing-materials-for-robotics-made-from-jelly-and-salt <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/selfhealingrobotics.jpg?itok=IX6Jk8iI" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽low-cost jelly-like materials, developed by researchers at the ֱ̽ of Cambridge, can sense strain, temperature and humidity. 
And unlike earlier self-healing robots, they can also partially repair themselves at room temperature.</p>&#13; &#13; <p> ֱ̽<a href="https://www.nature.com/articles/s41427-022-00357-9">results</a> are reported in the journal <em>NPG Asia Materials</em>.</p>&#13; &#13; <p>Soft sensing technologies could transform robotics, tactile interfaces and wearable devices, among other applications. However, most soft sensing technologies aren’t durable and consume high amounts of energy.</p>&#13; &#13; <p>“Incorporating soft sensors into robotics allows us to get a lot more information from them, like how strain on our muscles allows our brains to get information about the state of our bodies,” said David Hardman from Cambridge’s Department of Engineering, the paper’s first author.</p>&#13; &#13; <p>As part of the EU-funded SHERO project, Hardman and his colleagues have been working to develop soft sensing, self-healing materials for robotic hands and arms. These materials can detect when they are damaged, take the necessary steps to temporarily heal themselves and then resume work – all without the need for human interaction.</p>&#13; &#13; <p>“We’ve been working with self-healing materials for several years, but now we’re looking into faster and cheaper ways to make self-healing robots,” said co-author Dr Thomas George-Thuruthel, also from the Department of Engineering.</p>&#13; &#13; <p>Earlier versions of the self-healing robots needed to be heated in order to heal, but the Cambridge researchers are now developing materials that can heal at room temperature, which would make them more useful for real-world applications.</p>&#13; &#13; <p>“We started with a stretchy, gelatine-based material which is cheap, biodegradable and biocompatible and carried out different tests on how to incorporate sensors into the material by adding in lots of conductive components,” said Hardman.</p>&#13; &#13; <p> ֱ̽researchers found that printing sensors containing sodium chloride – salt – instead of carbon ink resulted in a material with the properties they were looking for. Since salt is soluble in the water-filled hydrogel, it provides a uniform channel for ionic conduction – the movement of ions.</p>&#13; &#13; <p>When measuring the electrical resistance of the printed materials, the researchers found that changes in strain resulted in a highly linear response, which they could use to calculate the deformations of the material. Adding salt also enabled sensing of stretches of more than three times the sensor’s original length, so that the material can be incorporated into flexible and stretchable robotic devices.</p>&#13; &#13; <p> ֱ̽self-healing materials are cheap and easy to make, either by 3D printing or casting. They are preferable to many existing alternatives since they show long-term strength and stability without drying out, and they are made entirely from widely available, food-safe, materials.</p>&#13; &#13; <p>“It’s a really good sensor considering how cheap and easy it is to make,” said George-Thuruthel. “We could make a whole robot out of gelatine and print the sensors wherever we need them.”</p>&#13; &#13; <p> ֱ̽self-healing hydrogels bond well with a range of different materials, meaning they can easily be incorporated with other types of robotics. For example, much of the research in the <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, where the researchers are based, is focused on the development of artificial hands. 
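</p>
<p>The calibration step implied by that highly linear response can be illustrated with a short Python sketch: fit a straight line to resistance measured at known strains, then invert the fit to estimate strain from new readings. The calibration numbers below are invented for illustration and are not measurements from the paper.</p>
<pre><code>
# Illustration of a linear strain calibration: if a sensor's resistance changes
# linearly with strain, a straight-line fit lets new resistance readings be
# converted back into strain estimates. Calibration data here is invented.
import numpy as np

# Hypothetical calibration: applied strain (stretch ratio minus 1) vs resistance (ohms)
strain = np.array([0.0, 0.5, 1.0, 1.5, 2.0])           # strain 2.0 is 3x original length
resistance = np.array([10.2, 15.1, 19.8, 25.3, 30.1])  # roughly linear response

slope, intercept = np.polyfit(strain, resistance, deg=1)

def estimate_strain(measured_resistance):
    """Invert the linear fit to recover strain from a resistance reading."""
    return (measured_resistance - intercept) / slope

print(f"fit: R = {slope:.2f} * strain + {intercept:.2f}")
print("estimated strain at 22.0 ohms:", round(estimate_strain(22.0), 2))
</code></pre>
<p>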
Although this material is a proof-of-concept, if developed further, it could be incorporated into artificial skins and custom-made wearable and biodegradable sensors.</p>&#13; &#13; <p>This work was supported by the <a href="https://katamaluku.id/">Self-HEaling soft RObotics (SHERO)</a> project, funded under the Future and Emerging Technologies (FET) programme of the European Commission.</p>&#13; &#13; <p> </p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; David Hardman, Thomas George-Thuruthel, and Fumiya Iida. ‘<a href="https://www.nature.com/articles/s41427-022-00357-9">Self-Healing Ionic Gelatin/Glycerol Hydrogels for Strain Sensing Applications</a>.’ NPG Asia Materials (2022). DOI: 10.1038/s41427-022-00357-9</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed self-healing, biodegradable, 3D-printed materials that could be used in the development of realistic artificial hands and other soft robotics applications.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It’s a really good sensor considering how cheap and easy it is to make</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Thomas George-Thuruthel</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-192031" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/self-healing-robot-developed-by-cambridge-uni-engineers">Self healing robot developed by Cambridge Uni engineers</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-4 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/eVH0YCeI464?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 18 Feb 2022 16:54:17 +0000 sc604 229951 at ֱ̽power of touch /stories/human-touch-fitzwilliam-museum <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p data-pm-slice="1 1 []">As a major Fitzwilliam Museum exhibition explores human touch through 4,000 years of art, Cambridge researchers explain why this sense is so important in their own work.</p> </p></div></div></div> Thu, 17 Jun 2021 05:30:00 +0000 ta385 224821 at A good egg: robot chef trained to make omelettes /research/news/a-good-egg-robot-chef-trained-to-make-omelettes <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/sequence0100013123still007.jpg?itok=ovls7YU1" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽researchers, from the ֱ̽ of Cambridge in collaboration with domestic appliance company Beko, used machine learning to train the robot to account for highly subjective matters of taste. ֱ̽<a href="https://ieeexplore.ieee.org/document/8954776">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>, and will be available online as part of the virtual <a href="https://www.icra2020.org/program/program-overview">IEEE International Conference on Robotics and Automation</a> (ICRA 2020).</p>&#13; &#13; <p>A robot that can cook has been an aspiration of sci-fi authors, futurists, and scientists for decades. As artificial intelligence techniques have advanced, commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.</p>&#13; &#13; <p>“Cooking is a really interesting problem for roboticists, as humans can never be totally objective when it comes to food, so how do we as scientists assess whether the robot has done a good job?” said Dr Fumiya Iida from Cambridge’s Department of Engineering, who led the research.</p>&#13; &#13; <p>Teaching a robot to prepare and cook food is a challenging task, since it must deal with complex problems in robot manipulation, computer vision, sensing and human-robot interaction, and produce a consistent end product.</p>&#13; &#13; <p>In addition, taste differs from person to person – cooking is a qualitative task, while robots generally excel at quantitative tasks. Since taste is not universal, universal solutions don’t exist. 
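</p>
<p>As described further below, the team handled this subjectivity with batch Bayesian optimisation driven by comparative feedback from human tasters. The sketch that follows is a deliberately simplified stand-in, an Elo-style update over a few candidate settings, meant only to show why pairwise ‘A tastes better than B’ judgements are enough to rank candidates; it is not the algorithm used in the paper, and the candidate names and judgements are invented.</p>
<pre><code>
# Simplified stand-in for learning from comparative taste feedback: an Elo-style
# update over a handful of candidate omelette settings. The real work uses batch
# Bayesian optimisation; this toy version only shows why pairwise judgements
# ("A tastes better than B") are enough to rank candidates.
import math

candidates = {
    "less_salt_short_whisk": 1000.0,
    "less_salt_long_whisk": 1000.0,
    "more_salt_short_whisk": 1000.0,
    "more_salt_long_whisk": 1000.0,
}

def update(winner, loser, k=32.0):
    """Shift scores after a taster prefers winner over loser."""
    expected_win = 1.0 / (1.0 + math.pow(10.0, (candidates[loser] - candidates[winner]) / 400.0))
    candidates[winner] += k * (1.0 - expected_win)
    candidates[loser] -= k * (1.0 - expected_win)

# Hypothetical taster judgements from one tasting batch.
judgements = [
    ("more_salt_long_whisk", "less_salt_short_whisk"),
    ("more_salt_long_whisk", "less_salt_long_whisk"),
    ("less_salt_long_whisk", "more_salt_short_whisk"),
]
for winner, loser in judgements:
    update(winner, loser)

for name, score in sorted(candidates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.1f}")
</code></pre>
<p>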
Unlike other optimisation problems, special tools need to be developed for robots to prepare food.</p>&#13; &#13; <p>Other research groups have trained robots to make cookies, pancakes and even pizza, but these robot chefs have not been optimised for the many subjective variables involved in cooking.</p>&#13; &#13; <p>Egg dishes, omelettes in particular, have long been considered a test of culinary skill. A popular piece of French culinary mythology states that each of the one hundred pleats in a chef’s hat represents a different way to cook an egg, although the exact origin of this adage is unknown.</p>&#13; &#13; <p>“An omelette is one of those dishes that is easy to make, but difficult to make well,” said Iida. “We thought it would be an ideal test to improve the abilities of a robot chef, and optimise for taste, texture, smell and appearance.”</p>&#13; &#13; <p>In partnership with <a href="https://www.beko.co.uk/">Beko</a>, Iida and his colleagues trained their robot chef to prepare an omelette, from cracking the eggs through to plating the finished dish. ֱ̽work was performed in Cambridge’s Department of Engineering, using a test kitchen supplied by Beko plc and Symphony Group.</p>&#13; &#13; <p> ֱ̽machine learning technique developed by Iida’s team makes use of a statistical tool, called Bayesian Inference, to squeeze out as much information as possible from the limited amount of data samples, which was necessary to avoid over-stuffing the human tasters with omelettes.</p>&#13; &#13; <p>“Another challenge we faced was the subjectivity of human sense of taste - humans aren’t very good at giving absolute measures, and usually give relative ones when it comes to taste,” said Iida. “So we needed to tweak the machine learning algorithm - the so-called batch algorithm - so that human tasters could give information based on comparative evaluations, rather than sequential ones.”</p>&#13; &#13; <p>But how did the robot measure up as a chef? “ ֱ̽omelettes, in general, tasted great – much better than expected!” said Iida.</p>&#13; &#13; <p> ֱ̽results show that machine learning can be used to obtain quantifiable improvements in food optimisation. Additionally, such an approach can be easily extended to multiple robotic chefs. Further studies have to be conducted to investigate other optimisation techniques and their viability.</p>&#13; &#13; <p>“Beko is passionate about designing the kitchen of the future and believes robotics applications such as this will play a crucial part. We are very happy to be collaborating with Dr Iida on this important topic,” said Dr Graham Anderson, the industrial project supervisor from Beko’s Cambridge R&amp;D Centre.</p>&#13; &#13; <p><strong><em>Reference:</em></strong><br /><em>Kai Junge et al. ‘</em><a href="https://ieeexplore.ieee.org/document/8954776"><em>Improving Robotic Cooking using Batch Bayesian Optimization</em></a><em>.’ IEEE Robotics and Automation Letters (2020). 
DOI: 10.1109/LRA.2020.2965418</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A team of engineers have trained a robot to prepare an omelette, all the way from cracking the eggs to plating the finished dish, and refined the ‘chef’s’ culinary skills to produce a reliable dish that actually tastes good.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Cooking is a really interesting problem for roboticists, as humans can never be totally objective when it comes to food, so how do we as scientists assess whether the robot has done a good job?</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Fumiya Iida</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-161752" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/161752">Can robots make omelettes?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-5 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/G99hk9Tfw9M?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; ֱ̽text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 01 Jun 2020 11:40:51 +0000 sc604 215002 at