University of Cambridge - Beko plc

Taste of the future: robot chef learns to ‘taste as you go’

Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.

Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.

When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato’s flavour will change.

The robot chef, which has already been trained to make omelettes based on human tasters’ feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.

The researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish over other electronic tasting technologies, which only test a single homogenised sample. The results are reported in the journal Frontiers in Robotics & AI.

The perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors, mostly on the tongue; and the signals from those receptors are passed to the brain. Once our brains are aware of the flavour, we decide whether we enjoy the food or not.

Taste is also highly individual: some people love spicy food, while others have a sweet tooth. A good cook, whether amateur or professional, relies on their sense of taste and can balance the various flavours within a dish to make a well-rounded final product.

“Most home cooks will be familiar with the concept of tasting as you go – checking a dish throughout the cooking process to see whether the balance of flavours is right,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking.”

“When we taste, the process of chewing also provides continuous feedback to our brains,” said co-author Dr Arsen Abdulali, also from the Department of Engineering.
“Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product.”

The researchers are members of Cambridge’s Bio-Inspired Robotics Laboratory (https://birlab.org/), run by Professor Fumiya Iida of the Department of Engineering, which focuses on training robots to carry out the so-called ‘last metre’ problems that humans find easy but robots find difficult. Cooking is one of these tasks: earlier tests with their robot ‘chef’ have produced a passable omelette using feedback from human tasters.

“We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking,” said Sochacki.

To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.

Using the probe, the robot ‘tasted’ the dishes in a grid-like fashion, returning a reading in just a few seconds.

To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. The different readings at different points of ‘chewing’ produced taste maps of each dish.

Their results showed a significant improvement in the ability of robots to assess saltiness over other electronic tasting methods, which are often time-consuming and only provide a single reading.

While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy, and that could be tweaked according to individual tastes.

“When a robot is learning how to cook, like any other cook, it needs indications of how well it did,” said Abdulali. “We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can ‘see’ the difference in the food as it’s chewed, which improves its ability to taste.”

“Beko has a vision to bring robots to the home environment which are safe and easy to use,” said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. “We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users.”

In future, the researchers are looking to improve the robot chef’s sensing capabilities so that it can taste a wider range of foods, such as sweet or oily dishes.

The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC) Centre for Doctoral Training on Agri-Food Robotics (Agriforwards CDT). EPSRC is part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.

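To make the grid-based ‘taste as you go’ measurement described above concrete, the sketch below shows one way salinity readings taken over a grid at several ‘chewing’ stages could be assembled into taste maps and summarised into features for judging seasoning. This is an illustrative sketch, not the authors’ code: the probe-reading function, grid size and summary statistics are all assumptions.

```python
import numpy as np

def taste_map(read_salinity, grid=(5, 5)):
    """Sample the dish over a grid and record one salinity reading per cell.

    `read_salinity(row, col)` is a hypothetical placeholder for moving the
    robot arm to a grid cell and reading the conductance probe; it is not
    part of the published system.
    """
    rows, cols = grid
    return np.array([[read_salinity(r, c) for c in range(cols)]
                     for r in range(rows)])

def mastication_features(maps_by_stage):
    """Summarise taste maps from each 'chewing' stage (e.g. unmixed, partly
    blended, fully blended) into a simple feature vector a classifier could
    use to judge how heavily and how evenly a dish is seasoned."""
    features = []
    for stage in sorted(maps_by_stage):
        m = maps_by_stage[stage]
        features += [m.mean(), m.std()]  # average saltiness and its spatial spread
    return np.array(features)

# Demo with simulated probe readings for three chewing stages
rng = np.random.default_rng(0)
maps = {
    stage: taste_map(lambda r, c, s=stage: rng.normal(1.0 + 0.2 * s, 0.1))
    for stage in range(3)
}
print(mastication_features(maps))
```

In practice, the feature vectors from many dishes would be fed to a classifier or regressor to predict seasoning level, which is the role mastication-enhanced tasting plays in the published study.
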
‘<a href="https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract">Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking</a>.’ Frontiers in Robotics &amp; AI (2022). DOI: 10.3389/frobt.2022.886074</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Grzegorz Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-194681" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/taste-of-the-future-robot-chef-learns-to-taste-as-you-go">Taste of the future: robot chef learns to ‘taste as you go’</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nZ0xviqzUJg?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> ֱ̽text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. 
Published: 4 May 2022

A good egg: robot chef trained to make omelettes

The researchers, from the University of Cambridge in collaboration with domestic appliance company Beko, used machine learning to train the robot to account for highly subjective matters of taste. The results are reported in the journal IEEE Robotics and Automation Letters, and will be available online as part of the virtual IEEE International Conference on Robotics and Automation (ICRA 2020).

A robot that can cook has been an aspiration of sci-fi authors, futurists and scientists for decades. As artificial intelligence techniques have advanced, commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.

“Cooking is a really interesting problem for roboticists, as humans can never be totally objective when it comes to food, so how do we as scientists assess whether the robot has done a good job?” said Dr Fumiya Iida from Cambridge’s Department of Engineering, who led the research.

Teaching a robot to prepare and cook food is a challenging task, since it must deal with complex problems in robot manipulation, computer vision, sensing and human-robot interaction, and produce a consistent end product.

In addition, taste differs from person to person – cooking is a qualitative task, while robots generally excel at quantitative ones. Since taste is not universal, universal solutions don’t exist: unlike other optimisation problems, special tools need to be developed for robots to prepare food.

Other research groups have trained robots to make cookies, pancakes and even pizza, but these robot chefs have not been optimised for the many subjective variables involved in cooking.

Egg dishes, omelettes in particular, have long been considered a test of culinary skill.
A popular piece of French culinary mythology states that each of the one hundred pleats in a chef’s hat represents a different way to cook an egg, although the exact origin of this adage is unknown.

“An omelette is one of those dishes that is easy to make, but difficult to make well,” said Iida. “We thought it would be an ideal test to improve the abilities of a robot chef, and optimise for taste, texture, smell and appearance.”

In partnership with Beko (https://www.beko.co.uk/), Iida and his colleagues trained their robot chef to prepare an omelette, from cracking the eggs through to plating the finished dish. The work was performed in Cambridge’s Department of Engineering, using a test kitchen supplied by Beko plc and Symphony Group.

The machine learning technique developed by Iida’s team uses a statistical tool called Bayesian inference to squeeze as much information as possible out of the limited number of data samples, which was necessary to avoid over-stuffing the human tasters with omelettes.

“Another challenge we faced was the subjectivity of the human sense of taste – humans aren’t very good at giving absolute measures, and usually give relative ones when it comes to taste,” said Iida. “So we needed to tweak the machine learning algorithm – the so-called batch algorithm – so that human tasters could give information based on comparative evaluations, rather than sequential ones.”

But how did the robot measure up as a chef? “The omelettes, in general, tasted great – much better than expected!” said Iida.

The results show that machine learning can be used to obtain quantifiable improvements in food optimisation. Such an approach can also be easily extended to multiple robotic chefs, although further studies are needed to investigate other optimisation techniques and their viability.

“Beko is passionate about designing the kitchen of the future and believes robotics applications such as this will play a crucial part. We are very happy to be collaborating with Dr Iida on this important topic,” said Dr Graham Anderson, the industrial project supervisor from Beko’s Cambridge R&D Centre.

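The method named in the paper is batch Bayesian optimisation of the recipe from taster feedback. As a rough, generic illustration only (not the authors’ implementation; the recipe parameters, ranges and scoring below are invented for the example, and the study’s comparative ratings are simplified to numeric scores), the sketch fits a Gaussian-process surrogate to taster scores and proposes a batch of new recipes for the next tasting round.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Illustrative recipe parameters to tune: whisking time [s] and salt [g].
bounds = np.array([[20.0, 120.0],   # whisking time
                   [0.5, 3.0]])     # salt

def propose_batch(X_tried, scores, batch_size=3, n_candidates=500, rng=None):
    """Fit a GP surrogate to taster scores and return a batch of new
    recipes to try, picked greedily by an upper-confidence-bound rule."""
    rng = rng or np.random.default_rng()
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_tried, scores)
    # Random candidate recipes within the allowed parameter ranges
    cand = rng.uniform(bounds[:, 0], bounds[:, 1],
                       size=(n_candidates, len(bounds)))
    mu, sigma = gp.predict(cand, return_std=True)
    ucb = mu + 1.5 * sigma  # favour recipes that are promising and uncertain
    return cand[np.argsort(ucb)[-batch_size:]]

# One round: three recipes already tasted, scored 1-10 by human tasters
X = np.array([[30.0, 1.0], [60.0, 2.0], [90.0, 2.5]])
y = np.array([4.0, 7.0, 6.0])
print(propose_batch(X, y, rng=np.random.default_rng(1)))
```

Proposing candidates in batches, as above, is what lets tasters compare several omelettes side by side in one sitting rather than scoring them one at a time in sequence.
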
Reference:
Kai Junge et al. ‘Improving Robotic Cooking using Batch Bayesian Optimization.’ IEEE Robotics and Automation Letters (2020). DOI: 10.1109/LRA.2020.2965418 (https://ieeexplore.ieee.org/document/8954776)

Video: Can robots make omelettes? (https://www.youtube-nocookie.com/embed/G99hk9Tfw9M)

Published: 1 June 2020