University of Cambridge – Josie Hughes
/taxonomy/people/josie-hughes

Robot uses machine learning to harvest lettuce
/research/news/robot-uses-machine-learning-to-harvest-lettuce

[Image: A robot arm picking lettuces]

The ‘Vegebot’, developed by a team at the University of Cambridge, was initially trained to recognise and harvest iceberg lettuce in a lab setting. It has now been successfully tested in a variety of field conditions in cooperation with G’s Growers, a local fruit and vegetable co-operative.

Although the prototype is nowhere near as fast or efficient as a human worker, it demonstrates how the use of robotics in agriculture might be expanded, even for crops like iceberg lettuce which are particularly challenging to harvest mechanically. The results are published in The Journal of Field Robotics (https://doi.org/10.1002/rob.21888).

Crops such as potatoes and wheat have been harvested mechanically at scale for decades, but many other crops have to date resisted automation. Iceberg lettuce is one such crop. Although it is the most common type of lettuce grown in the UK, iceberg is easily damaged and grows relatively flat to the ground, presenting a challenge for robotic harvesters.

“Every field is different, every lettuce is different,” said co-author Simon Birrell from Cambridge’s Department of Engineering. “But if we can make a robotic harvester work with iceberg lettuce, we could also make it work with many other crops.”

“At the moment, harvesting is the only part of the lettuce life cycle that is done manually, and it’s very physically demanding,” said co-author Julia Cai, who worked on the computer vision components of the Vegebot while she was an undergraduate student in the lab of Dr Fumiya Iida.

The Vegebot first identifies the ‘target’ crop within its field of vision, then determines whether a particular lettuce is healthy and ready to be harvested, and finally cuts the lettuce from the rest of the plant without crushing it, so that it is ‘supermarket ready’. “For a human, the entire process takes a couple of seconds, but it’s a really challenging problem for a robot,” said co-author Josie Hughes.

The Vegebot has two main components: a computer vision system and a cutting system. The overhead camera on the Vegebot takes an image of the lettuce field, first identifies all the lettuces in the image, and then classifies each lettuce as ready for harvest or not. A lettuce might be rejected because it is not yet mature, or because it has a disease that could spread to other lettuces in the harvest.

The researchers developed and trained a machine learning algorithm on example images of lettuces. Once the Vegebot could recognise healthy lettuces in the lab, it was then trained in the field, in a variety of weather conditions, on thousands of real lettuces.

A second camera on the Vegebot is positioned near the cutting blade and helps ensure a smooth cut.
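The Vegebot’s detector and classifier were learned from labelled images; as a rough, hypothetical illustration of the same two-stage structure described above (localise every lettuce in the overhead image, then decide per lettuce whether to harvest), here is a minimal classical computer-vision sketch in Python using OpenCV. The colour thresholds, area cut-offs and the brown-patch “disease” heuristic are invented for illustration and are not the authors’ trained model.

```python
import cv2

def find_lettuces(bgr_image, min_area=2000.0):
    """Localise lettuce-like green blobs in an overhead field image.

    Stand-in for the Vegebot's learned detector: threshold on green in
    HSV space, then keep sufficiently large contours as candidates.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Illustrative HSV range for 'lettuce green'; would need tuning per field.
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]

def ready_to_harvest(bgr_image, contour, min_area=5000.0, max_brown_fraction=0.1):
    """Toy harvest/reject rule: big enough (mature), and not too much brown
    in the crop patch (a crude proxy for visible disease).

    The real system used a classifier trained on labelled field images.
    """
    if cv2.contourArea(contour) < min_area:
        return False  # not yet mature
    x, y, w, h = cv2.boundingRect(contour)
    patch = cv2.cvtColor(bgr_image[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    brown = cv2.inRange(patch, (10, 50, 20), (30, 255, 200))
    return (brown.mean() / 255.0) < max_brown_fraction

if __name__ == "__main__":
    frame = cv2.imread("field.jpg")  # hypothetical overhead camera frame
    if frame is None:
        raise SystemExit("no field.jpg test image available")
    for lettuce in find_lettuces(frame):
        print(cv2.boundingRect(lettuce), ready_to_harvest(frame, lettuce))
```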
The researchers were also able to adjust the pressure in the robot’s gripping arm so that it held the lettuce firmly enough not to drop it, but not so firmly that it was crushed. The force of the grip can be adjusted for other crops (a toy force-regulation loop in this spirit is sketched at the end of this article).

“We wanted to develop approaches that weren’t necessarily specific to iceberg lettuce so that they can be used for other types of above-ground crops,” said Iida, who leads the team behind the research.

In future, robotic harvesters could help address labour shortages in agriculture, and could also help reduce food waste. At the moment, each field is typically harvested once, and any unripe vegetables or fruits are discarded. A robotic harvester, however, could be trained to pick only ripe vegetables, and since it could harvest around the clock, it could perform multiple passes on the same field, returning at a later date to harvest the vegetables that were unripe during previous passes.

“We’re also collecting lots of data about lettuce, which could be used to improve efficiency, such as which fields have the highest yields,” said Hughes. “We’ve still got to speed our Vegebot up to the point where it could compete with a human, but we think robots have lots of potential in agri-tech.”

Iida’s group at Cambridge is also part of the world’s first Centre for Doctoral Training (CDT) in agri-food robotics (https://agriforwards-cdt.blogs.lincoln.ac.uk/). In collaboration with researchers at the University of Lincoln and the University of East Anglia, the Cambridge researchers will train the next generation of specialists in robotics and autonomous systems for application in the agri-tech sector. The Engineering and Physical Sciences Research Council (EPSRC) has awarded £6.6m for the new CDT, which will support at least 50 PhD students.
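The article gives no controller details, so as a hedged, minimal sketch of the grip-pressure idea mentioned above (not the Vegebot’s actual control code), here is a simple proportional force loop in Python against a simulated gripper. The per-crop force setpoints, the gain and the linear pressure-to-force model are all invented for illustration.

```python
# Illustrative per-crop grip-force setpoints in newtons (invented values).
GRIP_SETPOINTS_N = {"iceberg_lettuce": 4.0, "broccoli": 8.0}

class SimulatedGripper:
    """Stand-in for real hardware: contact force rises linearly with pressure."""

    def __init__(self, stiffness=2.0):
        self.pressure = 0.0
        self.stiffness = stiffness

    def force_n(self):
        return self.stiffness * self.pressure

def regulate_grip(gripper, crop, kp=0.2, tolerance_n=0.05, max_steps=1000):
    """Toy proportional loop: nudge gripper pressure until the measured
    force reaches the crop's setpoint -- firm enough to hold, not to crush."""
    target = GRIP_SETPOINTS_N[crop]
    for _ in range(max_steps):
        error = target - gripper.force_n()
        if abs(error) <= tolerance_n:
            break
        # Increase pressure if the grip is too loose, back off if too tight.
        gripper.pressure = max(gripper.pressure + kp * error, 0.0)
    return gripper.pressure

gripper = SimulatedGripper()
print(f"settled pressure: {regulate_grip(gripper, 'iceberg_lettuce'):.2f}")
```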
Reference:
Simon Birrell et al. ‘A Field Tested Robotic Harvesting System for Iceberg Lettuce.’ Journal of Field Robotics (2019). DOI: 10.1002/rob.21888 (https://doi.org/10.1002/rob.21888)

Summary: A vegetable-picking robot that uses machine learning to identify and harvest a commonplace, but challenging, agricultural crop has been developed by engineers.

“For a human, the entire process takes a couple of seconds, but it’s a really challenging problem for a robot” – Josie Hughes

Video: Robot uses machine learning to harvest lettuce – https://www.youtube-nocookie.com/embed/EFC3OvkVKaQ

Published Sun, 07 Jul 2019


3D-printed robot hand ‘plays’ the piano
/research/news/3d-printed-robot-hand-plays-the-piano

[Image: Robot hand playing the piano. Credit: Josie Hughes]

The robot hand, developed by researchers at the University of Cambridge, was made by 3D-printing soft and rigid materials together to replicate all the bones and ligaments – but not the muscles or tendons – in a human hand.
Even though this limited the robot hand’s range of motion compared to a human hand, the researchers found that a surprisingly wide range of movement was still possible by relying on the hand’s mechanical design.

Using this ‘passive’ movement – in which the fingers cannot move independently – the robot was able to mimic different styles of piano playing without changing the material or mechanical properties of the hand. The results, reported in the journal Science Robotics (https://dx.doi.org/10.1126/scirobotics.aau3098), could help inform the design of robots that are capable of more natural movement with minimal energy use.

Complex movement in animals and machines results from the interplay between the brain (or controller), the environment and the mechanical body. The mechanical properties and design of systems are important for intelligent functioning, and help both animals and machines to move in complex ways without expending unnecessary amounts of energy.

“We can use passivity to achieve a wide range of movement in robots: walking, swimming or flying, for example,” said Josie Hughes from Cambridge’s Department of Engineering, the paper’s first author. “Smart mechanical design enables us to achieve the maximum range of movement with minimal control costs: we wanted to see just how much movement we could get with mechanics alone.”

Over the past several years, soft components have begun to be integrated into robotics design thanks to advances in 3D printing techniques, which have allowed researchers to add complexity to these passive systems.

The human hand is incredibly complex, and recreating all of its dexterity and adaptability in a robot is a massive research challenge. Most of today’s advanced robots are not capable of manipulation tasks that small children can perform with ease.

“The basic motivation of this project is to understand embodied intelligence, that is, the intelligence in our mechanical body,” said Dr Fumiya Iida, who led the research. “Our bodies consist of smart mechanical designs such as bones, ligaments, and skins that help us behave intelligently even without active brain-led control. By using the state-of-the-art 3D printing technology to print human-like soft hands, we are now able to explore the importance of physical designs, in isolation from active control, which is impossible to do with human piano players as the brain cannot be ‘switched off’ like our robot.”

“Piano playing is an ideal test for these passive systems, as it’s a complex and nuanced challenge requiring a significant range of behaviours in order to achieve different playing styles,” said Hughes.

The robot was ‘taught’ to play by considering how the mechanics, material properties, environment and wrist actuation all affect the dynamic model of the hand. By actuating the wrist, it is possible to choose how the hand interacts with the piano, allowing the embodied intelligence of the hand to determine how it interacts with the environment.

The researchers programmed the robot to play a number of short musical phrases with clipped (staccato) or smooth (legato) notes, achieved through the movement of the wrist, as illustrated in the sketch below.
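The paper achieves staccato and legato purely through wrist actuation acting on the passive hand. As a loose, hypothetical illustration of how a single wrist degree of freedom can separate the two articulations, here is a small Python sketch that generates two wrist-angle profiles; the waveform shapes, amplitudes and timings are invented, not taken from the paper, which learned conditional models of the hand-piano interaction rather than using fixed waveforms.

```python
import numpy as np

def wrist_trajectory(style, n_notes=4, note_period=0.5, dt=0.01, amplitude=0.3):
    """Generate a wrist-angle profile (radians) for a sequence of notes.

    'staccato': a short press bump at the start of each note, lifted after;
    'legato': one continuous oscillation, never fully lifting between notes.
    Shapes are illustrative only.
    """
    t = np.arange(0.0, n_notes * note_period, dt)
    phase = (t % note_period) / note_period  # position within each note, 0..1
    if style == "staccato":
        # Press during the first 30% of each note, then return to neutral.
        angle = -amplitude * np.sin(np.pi * np.clip(phase / 0.3, 0.0, 1.0))
    elif style == "legato":
        # Smooth sinusoid offset below neutral: notes blend into each other.
        angle = -amplitude * (0.6 + 0.4 * np.sin(2 * np.pi * t / note_period))
    else:
        raise ValueError(f"unknown style: {style!r}")
    return t, angle

t, staccato = wrist_trajectory("staccato")
_, legato = wrist_trajectory("legato")
# Fraction of time the wrist is pressed below neutral for each style:
# clipped notes give short contacts, smooth notes give sustained contact.
print(f"staccato contact fraction ~ {np.mean(staccato < -0.01):.2f}")
print(f"legato contact fraction ~ {np.mean(legato < -0.01):.2f}")
```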
“It’s just the basics at this point, but even with this single movement, we can still get quite complex and nuanced behaviour,” said Hughes.

Despite the limitations of the robot hand, the researchers say their approach will drive further research into the underlying principles of skeletal dynamics to achieve complex movement tasks, as well as learning where the limitations for passive movement systems lie.

“This approach to mechanical design can change how we build robotics,” said Iida. “The fabrication approach allows us to design mechanically intelligent structures in a way that is highly scalable.”

“We can extend this research to investigate how we can achieve even more complex manipulation tasks: developing robots which can perform medical procedures or handle fragile objects, for instance,” said Hughes. “This approach also reduces the amount of machine learning required to control the hand; by developing mechanical systems with intelligence built in, it makes control much easier for robots to learn.”

The research was funded by the Engineering and Physical Sciences Research Council (EPSRC).

Reference:
J.A.E. Hughes, P. Maiolino, F. Iida. ‘An Anthropomorphic Soft Skeleton Hand Exploiting Conditional Models for Piano Playing.’ Science Robotics (2018). DOI: 10.1126/scirobotics.aau3098 (https://dx.doi.org/10.1126/scirobotics.aau3098)

Summary: Scientists have developed a 3D-printed robotic hand which can play simple musical phrases on the piano by just moving its wrist. And while the robot is no virtuoso, it demonstrates just how challenging it is to replicate all the abilities of a human hand, and how much complex movement can still be achieved through design.
“Smart mechanical design enables us to achieve the maximum range of movement with minimal control costs: we wanted to see just how much movement we could get with mechanics alone” – Josie Hughes

Video: 3D-printed robot hand ‘plays’ the piano – https://www.youtube-nocookie.com/embed/XDgffOW6ZzQ

Image credit: Josie Hughes – Robot hand playing the piano

Published Wed, 19 Dec 2018