University of Cambridge - Grzegorz Sochacki /taxonomy/people/grzegorz-sochacki en Robot ‘chef’ learns to recreate recipes from watching food videos /research/news/robot-chef-learns-to-recreate-recipes-from-watching-food-videos <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/untitled-3_1.jpg?itok=RV53FI1P" alt="Robot arm reaching for a piece of broccoli" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, programmed their robotic chef with a ‘cookbook’ of eight simple salad recipes. After watching a video of a human demonstrating one of the recipes, the robot was able to identify which recipe was being prepared and make it.</p>&#13; &#13; <p>In addition, the videos helped the robot incrementally add to its cookbook. At the end of the experiment, the robot came up with a ninth recipe on its own. Their <a href="https://ieeexplore.ieee.org/document/10124218">results</a>, reported in the journal <em>IEEE Access</em>, demonstrate how video content can be a valuable and rich source of data for automated food production, and could enable easier and cheaper deployment of robot chefs.</p>&#13; &#13; <p>Robotic chefs have been featured in science fiction for decades, but in reality, cooking is a challenging problem for a robot. 
Several commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.</p>&#13; &#13; <p>Human cooks can learn new recipes through observation, whether that’s watching another person cook or watching a video on YouTube, but programming a robot to make a range of dishes is costly and time-consuming.</p>&#13; &#13; <p>“We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author.</p>&#13; &#13; <p>Sochacki, a PhD candidate in Professor Fumiya Iida’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, and his colleagues devised eight simple salad recipes and filmed themselves making them. They then used a publicly available neural network to train their robot chef. The neural network had already been programmed to identify a range of different objects, including the fruits and vegetables used in the eight salad recipes (broccoli, carrot, apple, banana and orange).</p>&#13; &#13; <p>Using computer vision techniques, the robot analysed each frame of video and was able to identify the different objects and features, such as a knife and the ingredients, as well as the human demonstrator’s arms, hands and face. Both the recipes and the videos were converted to vectors, and the robot performed mathematical operations on the vectors to determine the similarity between a demonstration and each recipe.</p>&#13; &#13; <p>By correctly identifying the ingredients and the actions of the human chef, the robot could determine which of the recipes was being prepared. 
The robot could infer that if the human demonstrator was holding a knife in one hand and a carrot in the other, the carrot would then get chopped up.</p>&#13; &#13; <p>Of the 16 videos it watched, the robot recognised the correct recipe 93% of the time, even though it only detected 83% of the human chef’s actions. The robot was also able to recognise that slight variations in a recipe, such as making a double portion or normal human error, were just variations and not a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its cookbook and made it.</p>&#13; &#13; <p>“It’s amazing how much nuance the robot was able to detect,” said Sochacki. “These recipes aren’t complex – they’re essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots.”</p>&#13; &#13; <p>The videos used to train the robot chef are not like the food videos made by some social media influencers, which are full of fast cuts and visual effects, and quickly move back and forth between the person preparing the food and the dish they’re preparing. For example, the robot would struggle to identify a carrot if the human demonstrator had their hand wrapped around it – for the robot to identify the carrot, the human demonstrator had to hold up the carrot so that the robot could see the whole vegetable.</p>&#13; &#13; <p>“Our robot isn’t interested in the sorts of food videos that go viral on social media – they’re simply too hard to follow,” said Sochacki. 
“But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes.”</p>&#13; &#13; <p>The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Grzegorz Sochacki et al. ‘<a href="https://ieeexplore.ieee.org/document/10124218">Recognition of Human Chef’s Intentions for Incremental Learning of Cookbook by Robotic Salad Chef</a>.’ IEEE Access (2023). DOI: 10.1109/ACCESS.2023.3276234</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have trained a robotic ‘chef’ to watch and learn from cooking videos, and recreate the dish itself.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Greg Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-208991" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/robot-chef-learns-to-recreate-recipes-from-watching-food-videos">Robot ‘chef’ learns to recreate recipes from watching food videos</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe 
class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nx3k4XA3x4Q?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/social-media/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 05 Jun 2023 01:00:00 +0000 sc604 239811 at Taste of the future: robot chef learns to ‘taste as you go’ /research/news/taste-of-the-future-robot-chef-learns-to-taste-as-you-go <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/chef.jpg?itok=zwU4FEoU" alt="" title="Credit: None" 
/></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.</p> <p>Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.</p> <p>When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer releases juices, and as we chew and release saliva and digestive enzymes, our perception of the tomato’s flavour changes.</p> <p>The robot chef, which has already been trained to make omelettes based on human tasters’ feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.</p> <p>The researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish over other electronic tasting technologies, which only test a single homogenised sample. The <a href="https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract">results</a> are reported in the journal <em>Frontiers in Robotics &amp; AI</em>.</p> <p>The perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors mostly on the tongue; and the signals from taste receptors are passed to the brain. 
Once our brains are aware of the flavour, we decide whether we enjoy the food or not.</p> <p>Taste is also highly individual: some people love spicy food, while others have a sweet tooth. A good cook, whether amateur or professional, relies on their sense of taste, and can balance the various flavours within a dish to make a well-rounded final product.</p> <p>“Most home cooks will be familiar with the concept of tasting as you go – checking a dish throughout the cooking process to check whether the balance of flavours is right,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking.”</p> <p>“When we taste, the process of chewing also provides continuous feedback to our brains,” said co-author Dr Arsen Abdulali, also from the Department of Engineering. “Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product.”</p> <p>The researchers are members of Cambridge’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a> run by <a href="http://mi.eng.cam.ac.uk/Main/FI224">Professor Fumiya Iida</a> of the Department of Engineering, which focuses on training robots to carry out so-called ‘last metre’ problems: tasks that humans find easy, but robots find difficult. 
Cooking is one of these tasks: earlier tests with their robot ‘chef’ have produced a passable omelette using feedback from human tasters.</p> <p>“We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking,” said Sochacki.</p> <p>To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.</p> <p>Using the probe, the robot ‘tasted’ the dishes in a grid-like fashion, returning a reading in just a few seconds.</p> <p>To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. The different readings at different points of ‘chewing’ produced taste maps of each dish.</p> <p>Their results showed a significant improvement in the ability of robots to assess saltiness over other electronic tasting methods, which are often time-consuming and only provide a single reading.</p> <p>While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy, and that could be tweaked according to individual tastes.</p> <p>“When a robot is learning how to cook, like any other cook, it needs indications of how well it did,” said Abdulali. “We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can ‘see’ the difference in the food as it’s chewed, which improves its ability to taste.”</p> <p>“Beko has a vision to bring robots to the home environment which are safe and easy to use,” said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. 
“We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users.”</p> <p>In future, the researchers are looking to improve the robot chef so it can taste different types of food, and to improve its sensing capabilities so it can taste sweet or oily food, for example.</p> <p>The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC) Centre for Doctoral Training on Agri-Food Robotics (Agriforwards CDT). EPSRC is part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.</p> <p><em><strong>Reference:</strong><br /> Grzegorz Sochacki, Arsen Abdulali, and Fumiya Iida. ‘<a href="https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract">Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking</a>.’ Frontiers in Robotics &amp; AI (2022). 
DOI: 10.3389/frobt.2022.886074</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Grzegorz Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-194681" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/taste-of-the-future-robot-chef-learns-to-taste-as-you-go">Taste of the future: robot chef learns to ‘taste as you go’</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-2 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nZ0xviqzUJg?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 
4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 04 May 2022 04:00:00 +0000 sc604 231861 at