University of Cambridge - robotics /taxonomy/subjects/robotics en ‘Palaeo-robots’ to help scientists understand how fish started to walk on land /research/news/palaeo-robots-to-help-scientists-understand-how-fish-started-to-walk-on-land <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/paleo-robots-883x432.jpg?itok=rSGMB0cY" alt="Illustration of palaeo-robots." title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://doi.org/10.1126/scirobotics.adn1125">Writing</a> in the journal <em>Science Robotics</em>, the research team, led by the University of Cambridge, outline how ‘palaeo-inspired robotics’ could provide a valuable experimental approach to studying how the pectoral and pelvic fins of ancient fish evolved to support weight on land.</p> <p>“Since fossil evidence is limited, we have an incomplete picture of how ancient life made the transition to land,” said lead author <a href="https://www.michaelishida.com/">Dr Michael Ishida</a> from Cambridge’s Department of Engineering. “Palaeontologists examine ancient fossils for clues about the structure of hip and pelvic joints, but there are limits to what we can learn from fossils alone. That’s where robots can come in, helping us fill gaps in the research, particularly when studying major shifts in how vertebrates moved.”</p> <p>Ishida is a member of Cambridge’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, led by Professor Fumiya Iida. The team is developing energy-efficient robots for a variety of applications, which take their inspiration from the efficient ways that animals and humans move.</p> <p>With funding from the Human Frontier Science Program, the team is developing palaeo-inspired robots, in part by taking their inspiration from modern-day ‘walking fish’ such as mudskippers, and from fossils of extinct fish. “In the lab, we can’t make a living fish walk differently, and we certainly can’t get a fossil to move, so we’re using robots to simulate their anatomy and behaviour,” said Ishida.</p> <p>The team is creating robotic analogues of ancient fish skeletons, complete with mechanical joints that mimic muscles and ligaments. Once complete, the team will perform experiments on these robots to determine how these ancient creatures might have moved.</p> <p>“We want to know things like how much energy different walking patterns would have required, or which movements were most efficient,” said Ishida. “This data can help confirm or challenge existing theories about how these early animals evolved.”</p> <p>One of the biggest challenges in this field is the lack of comprehensive fossil records. Many of the ancient species from this period in Earth’s history are known only from partial skeletons, making it difficult to reconstruct their full range of movement.</p> <p>“In some cases, we’re just guessing how certain bones connected or functioned,” said Ishida. “That’s why robots are so useful—they help us confirm these guesses and provide new evidence to support or rebut them.”</p> <p>While robots are commonly used to study movement in living animals, very few research groups are using them to study extinct species. “There are only a few groups doing this kind of work,” said Ishida.
“But we think it’s a natural fit – robots can provide insights into ancient animals that we simply can’t get from fossils or modern species alone.”</p> <p>The team hopes that their work will encourage other researchers to explore the potential of robotics to study the biomechanics of long-extinct animals. “We’re trying to close the loop between fossil evidence and real-world mechanics,” said Ishida. “Computer models are obviously incredibly important in this area of research, but since robots are interacting with the real world, they can help us test theories about how these creatures moved, and maybe even why they moved the way they did.”</p> <p>The team is currently in the early stages of building their palaeo-robots, but they hope to have some results within the next year. The researchers say they hope their robot models will not only deepen understanding of evolutionary biology, but could also open up new avenues of collaboration between engineers and researchers in other fields.</p> <p>The research was supported by the Human Frontier Science Program. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge. Michael Ishida is a Postdoctoral Research Associate at Gonville and Caius College, Cambridge.</p> <p><em><strong>Reference:</strong><br /> Michael Ishida et al. ‘<a href="https://doi.org/10.1126/scirobotics.adn1125">Paleo-inspired robotics as an experimental approach to the history of life</a>.’ Science Robotics (2024). DOI: 10.1126/scirobotics.adn1125</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The transition from water to land is one of the most significant events in the history of life on Earth. Now, a team of roboticists, palaeontologists and biologists is using robots to study how the ancestors of modern land animals transitioned from swimming to walking, about 390 million years ago.</p> </p></div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 23 Oct 2024 18:00:00 +0000 sc604 248514 at Getting to grips with an extra thumb /stories/third-thumb <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge researchers have shown that members of the public have little trouble in learning very quickly how to use a third thumb – a controllable, prosthetic extra thumb – to pick up and manipulate objects.</p> </p></div></div></div> Wed, 29 May 2024 18:00:58 +0000 cjb250 246171 at Robotic nerve ‘cuffs’ could help treat a range of neurological conditions /research/news/robotic-nerve-cuffs-could-help-treat-a-range-of-neurological-conditions <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/gettyimages-1457349338-dp.jpg?itok=dTF6w9Hu" alt="Illustration of the human nervous system" title="Illustration of the human nervous system, Credit: XH4D via iStock / Getty Images Plus" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, combined flexible electronics and soft robotics techniques to develop the devices, which could be used for the diagnosis and treatment of a range of disorders, including epilepsy and chronic pain, or the control of prosthetic limbs.</p> <p>Current tools for interfacing with the peripheral nerves – the 43 pairs of motor and sensory nerves that connect the brain and spinal cord to the rest of the body – are outdated, bulky and carry a high risk of nerve injury. However, the robotic nerve ‘cuffs’ developed by the Cambridge team are sensitive enough to grasp or wrap around delicate nerve fibres without causing any damage.</p> <p>Tests of the nerve cuffs in rats showed that the devices only require tiny voltages to change shape in a controlled way, forming a self-closing loop around nerves without the need for surgical sutures or glues.</p> <p>The researchers say the combination of soft electrical actuators with neurotechnology could be an answer to minimally invasive monitoring and treatment for a range of neurological conditions. The <a href="https://www.nature.com/articles/s41563-024-01886-0">results</a> are reported in the journal <em>Nature Materials</em>.</p> <p>Electric nerve implants can be used to either stimulate or block signals in target nerves. For example, they might help relieve pain by blocking pain signals, or they could be used to restore movement in paralysed limbs by sending electrical signals to the nerves.
Nerve monitoring is also a standard surgical procedure when operating in areas of the body containing a high concentration of nerve fibres, such as anywhere near the spinal cord.</p> <p>These implants allow direct access to nerve fibres, but they come with certain risks. “Nerve implants come with a high risk of nerve injury,” said Professor George Malliaras from Cambridge’s Department of Engineering, who led the research. “Nerves are small and highly delicate, so anytime you put something large, like an electrode, in contact with them, it represents a danger to the nerves.”</p> <p>“Nerve cuffs that wrap around nerves are the least invasive implants currently available, but despite this they are still too bulky, stiff and difficult to implant, requiring significant handling and potential trauma to the nerve,” said co-author Dr Damiano Barone from Cambridge’s Department of Clinical Neurosciences.</p> <p>The researchers designed a new type of nerve cuff made from conducting polymers, normally used in soft robotics. The ultra-thin cuffs are engineered in two separate layers. Applying tiny amounts of electricity – just a few hundred millivolts – causes the devices to swell or shrink.</p> <p>The cuffs are small enough that they could be rolled up into a needle and injected near the target nerve. When activated electrically, the cuffs will change their shape to wrap around the nerve, allowing nerve activity to be monitored or altered.</p> <p>“To ensure the safe use of these devices inside the body, we have managed to reduce the voltage required for actuation to very low values,” said Dr Chaoqun Dong, the paper’s first author. “What's even more significant is that these cuffs can change shape in both directions and be reprogrammed. This means surgeons can adjust how tightly the device fits around a nerve until they get the best results for recording and stimulating the nerve.”</p> <p>Tests in rats showed that the cuffs could be successfully placed without surgery, and formed a self-closing loop around the target nerve. The researchers are planning further testing of the devices in animal models, and are hoping to begin testing in humans within the next few years.</p> <p>“Using this approach, we can reach nerves that are difficult to reach through open surgery, such as the nerves that control pain, vision or hearing, but without the need to implant anything inside the brain,” said Barone. “The ability to place these cuffs so they wrap around the nerves makes this a much easier procedure for surgeons, and it’s less risky for patients.”</p> <p>“The ability to make an implant that can change shape through electrical activation opens up a range of future possibilities for highly targeted treatments,” said Malliaras. “In future, we might be able to have implants that can move through the body, or even into the brain – it makes you dream how we could use technology to benefit patients in future.”</p> <p>The research was supported in part by the Swiss National Science Foundation, the Cambridge Trust, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p> <p> </p> <p><em><strong>Reference:</strong><br /> Chaoqun Dong et al. ‘<a href="https://www.nature.com/articles/s41563-024-01886-0">Electrochemically actuated microelectrodes for minimally invasive peripheral nerve interfaces</a>.’ Nature Materials (2024).
DOI: 10.1038/s41563-024-01886-0</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed tiny, flexible devices that can wrap around individual nerve fibres without damaging them.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">The ability to make an implant that can change shape through electrical activation opens up a range of future possibilities for highly targeted treatments</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">George Malliaras</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">XH4D via iStock / Getty Images Plus</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Illustration of the human nervous system</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 26 Apr 2024 08:55:34 +0000 sc604 245801 at Opinion: the future of science is automation /research/news/opinion-the-future-of-science-is-automation <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/gettyimages-1395524709-dp.jpg?itok=iwMn4UQt" alt="Robot arm handling test tubes." title="Robot arm handling test tubes., Credit: kynny via Getty Images" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Thanks to the widespread availability of food and medical care, the ability to travel, and many other scientific and technological developments, billions of people today are living better lives than kings of centuries past.
It is deeply surprising to me how little appreciated this astonishing fact is.</p> <p>Of course, despite all the progress we’ve made, the world faces many challenges in the 21st century: climate change, pandemics, poverty and cancer, to name just a few.</p> <p>If all the countries in the world could join together to share technology and resources, we might be able to deal with and overcome these challenges. However, history presents no example of such collaboration, and the current geopolitical situation does not offer much in the way of hope.</p> <p>Our best hope of dealing with these challenges is to make science and technology more productive. The only feasible way to achieve this is through the integration of Artificial Intelligence (AI) and laboratory automation.</p> <p>AI systems already possess superhuman scientific powers. They can remember massive volumes of facts and learn from huge datasets. They can execute flawless logical reasoning, and near-optimal probabilistic reasoning. They can read every scientific paper, indeed everything ever written. These powers are complementary to those of human scientists.</p> <p>When the scientific method was developed in the 17th century, one of the core insights was the need to conduct experiments in the physical world, not just to think.</p> <p>Today, laboratory automation is steadily advancing, and robots can now carry out most of the laboratory tasks that humans can. We are also now seeing the emergence of the ‘Cloud Lab’ concept. The idea is to provide laboratory automation at scale and remotely, with scientists sending their samples to the cloud lab, using a computer interface to design and execute their experiments.</p> <p>And then there are AI Scientists: AI systems integrated with laboratory automation that are capable of carrying out the closed-loop automation of scientific research (aka 'Robot Scientists', 'Self-driving Labs'). These systems automatically originate hypotheses to explain observations, devise experiments to test these hypotheses, physically run these experiments using laboratory robotics, interpret the results, and then repeat the cycle.</p> <p>AI Scientists can work cheaper, faster, more accurately, and longer than humans. They can also be easily multiplied. As the experiments are conceived and executed automatically by computer, it’s possible to completely capture and digitally curate all aspects of the scientific process, making the science more reproducible. There are now around 100 AI Scientists around the world, working in areas from quantum mechanics to astronomy, from chemistry to medicine.</p> <p>Within the last year or so the world has been stunned by the success of Large Language Models (LLMs) such as ChatGPT, which have achieved breakthrough performance on a wide range of conversation-based tasks. LLMs are surprisingly strong absorbers of technical knowledge, such as chemical reactions and logical expressions. LLMs, and more broadly Foundation Models, show great potential for super-charging AI Scientists. They can act both as a source of scientific knowledge, since they have read all the scientific literature, and a source of new scientific hypotheses.</p> <p>One of the current problems with LLMs is their tendency to hallucinate, that is to output statements that are not true. While this is a serious problem in many applications, it is not necessarily so in science, where physical experiments are the arbiters of truth.
Hallucinations are hypotheses.</p> <p>AI has been used as a tool in the research behind tens of thousands of scientific papers. We believe this is only a start. We believe that AI has the potential to transform the very process of science.</p> <p>We believe that by harnessing the power of AI, we can propel humanity toward a future where groundbreaking achievements in science, even achievements worthy of a Nobel Prize, can be fully automated. Such advances could transform science and technology, and provide hope of dealing with the formidable challenges that face humankind in the 21st century.</p> <p>The <a href="https://www.nobelturingchallenge.org/">Nobel Turing Challenge</a> aims to develop AI Scientists capable of making Nobel-quality scientific discoveries at a level comparable to, and possibly superior to, the best human scientists by 2050.</p> <p>As well as being a potential transformative power for good, the application of AI to science has potential for harm. As a step towards preventing this harm, my colleagues and I have prepared the Stockholm Declaration on AI for Science. This commits the signees to the responsible and ethical development of AI for science. A copy of the declaration can be signed on <a href="https://sites.google.com/view/stockholm-declaration" title="External link: The Stockholm Declaration on AI for Science">The Stockholm Declaration on AI for Science</a> website. </p> <p>We urge all scientists working with AI to sign.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Professor Ross King from Cambridge's Department of Chemical Engineering and Biotechnology, who originated the idea of a 'Robot Scientist', discusses why he believes that AI-powered scientists could surpass the best human scientists by the middle of the century, but only if artificial intelligence for science is developed responsibly and ethically. </p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">kynny via Getty Images</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot arm handling test tubes.</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 26 Feb 2024 13:02:43 +0000 Anonymous 244711 at Robot trained to read braille at twice the speed of humans /research/news/robot-trained-to-read-braille-at-twice-the-speed-of-humans <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/img-4841-dp.jpg?itok=RoYah_Zz" alt="Robot braille reader" title="Robot braille reader, Credit: Parth Potdar" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The research team, from the University of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. The robot was able to read the braille at 315 words per minute at close to 90% accuracy.</p> <p>Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. The <a href="https://ieeexplore.ieee.org/document/10410896">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>.</p> <p>Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.</p> <p>Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In <a href="https://birlab.org/">Professor Fumiya Iida’s lab</a> in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.</p> <p>“The softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”</p> <p>Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. The researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.</p> <p>“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering.
“Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”</p> <p>The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.</p> <p>The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.</p> <p>Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.</p> <p>“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”</p> <p>“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.</p> <p>In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.</p> <p> </p> <p><em><strong>Reference:</strong><br /> Parth Potdar et al. ‘<a href="https://ieeexplore.ieee.org/document/10410896">High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions</a>.’ IEEE Robotics and Automation Letters (2024).
DOI: 10.1109/LRA.2024.3356978</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-217601" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/can-robots-read-braille">Can robots read braille?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/xqtA2Z668Ic?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Parth Potdar</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot braille reader</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 29 Jan 2024 06:04:52 +0000 sc604 244161 at The Vice-Chancellor's Awards 2023 for Research Impact and Engagement /public-engagement/vc-awards/2023 <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Meet the winners of the Vice-Chancellor's Awards 2023 for Research Impact and Engagement and learn more about their projects.</p> </p></div></div></div> Wed, 13 Dec 2023 09:20:46 +0000 zs332 243781 at Robots cause company profits to fall – at least at first /research/news/robots-cause-company-profits-to-fall-at-least-at-first <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/gettyimages-1408271637-dp.jpg?itok=uZqWd7Is" alt="Robots on a manufacturing line" title="Robots on a manufacturing line, Credit: kynny via Getty Images" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, studied industry data from the UK and 24 other European countries between 1995 and 2017, and found that at low levels of adoption, robots have a negative effect on profit margins. But at higher levels of adoption, robots can help increase profits.</p>&#13; &#13; <p>According to the researchers, this U-shaped phenomenon is due to the relationship between reducing costs, developing new processes and innovating new products. While many companies first adopt robotic technologies to decrease costs, this ‘process innovation’ can be easily copied by competitors, so at low levels of robot adoption, companies are focused on their competitors rather than on developing new products. However, as levels of adoption increase and robots are fully integrated into a company’s processes, the technologies can be used to increase revenue by innovating new products.</p>&#13; &#13; <p>In other words, firms using robots are likely to focus initially on streamlining their processes before shifting their emphasis to product innovation, which gives them greater market power via the ability to differentiate from their competitors. The <a href="https://ieeexplore.ieee.org/document/10202238">results</a> are reported in the journal <em>IEEE Transactions on Engineering Management</em>.</p>&#13; &#13; <p>Robots have been widely used in industry since the 1980s, especially in sectors where they can carry out physically demanding, repetitive tasks, such as automotive assembly.
In the decades since, the rate of robot adoption has increased dramatically and consistently worldwide, and the development of precise, electrically controlled robots makes them particularly useful for high-value manufacturing applications requiring greater precision, such as electronics.</p>&#13; &#13; <p>While robots have been shown to reliably raise labour productivity at an industry or country level, what has been less studied is how robots affect profit margins at a similar macro scale.</p>&#13; &#13; <p>“If you look at how the introduction of computers affected productivity, you actually see a slowdown in productivity growth in the 1970s and early 1980s, before productivity starts to rise again, which it did until the financial crisis of 2008,” said co-author Professor Chander Velu from Cambridge’s Institute for Manufacturing. “It’s interesting that a tool meant to increase productivity had the opposite effect, at least at first. We wanted to know whether there is a similar pattern with robotics.”</p>&#13; &#13; <p>“We wanted to know whether companies were using robots to improve processes within the firm, rather than improve the whole business model,” said co-author Dr Philip Chen. “Profit margin can be a useful way to analyse this.”</p>&#13; &#13; <p>The researchers examined industry-level data for 25 EU countries (including the UK, which was a member at the time) between 1995 and 2017. While the data did not drill down to the level of individual companies, the researchers were able to look at whole sectors, primarily in manufacturing where robots are commonly used.</p>&#13; &#13; <p>The researchers then obtained robotics data from the International Federation of Robotics (IFR) database. By comparing the two sets of data, they were able to analyse the effect of robotics on profit margins at a country level.</p>&#13; &#13; <p>“Intuitively, we thought that more robotic technologies would lead to higher profit margins, but the fact that we see this U-shaped curve instead was surprising,” said Chen.</p>&#13; &#13; <p>“Initially, firms are adopting robots to create a competitive advantage by lowering costs,” said Velu. “But process innovation is cheap to copy, and competitors will also adopt robots if it helps them make their products more cheaply. This then starts to squeeze margins and reduce profit margin.”</p>&#13; &#13; <p>The researchers then carried out a series of interviews with an American medical equipment manufacturer to study their experiences with robot adoption.</p>&#13; &#13; <p>“We found that it’s not easy to adopt robotics into a business – it costs a lot of money to streamline and automate processes,” said Chen.</p>&#13; &#13; <p>“When you start bringing more and more robots into your process, eventually you reach a point where your whole process needs to be redesigned from the bottom up,” said Velu. “It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point.”</p>&#13; &#13; <p>The researchers say that if companies want to reach the profitable side of the U-shaped curve more quickly, it’s important that the business model is adapted concurrently with robot adoption.
Only after robots are fully integrated into the business model can companies fully use the power of robotics to develop new products, driving profits.</p>&#13; &#13; <p>A related piece of work being led by the Institute for Manufacturing is a community programme to help small- and medium-sized enterprises (SMEs) to adopt digital technologies including robotics in a low-cost, low-risk way. “Incremental and step changes in this area enable SMEs to get the benefits of cost reduction as well as margin improvements from new products,” said co-author Professor Duncan McFarlane.</p>&#13; &#13; <p>The research was supported by the Engineering and Physical Sciences Research Council (EPSRC) and the Economic and Social Research Council (ESRC), which are both part of UK Research and Innovation (UKRI). Chander Velu is a Fellow of Selwyn College, Cambridge. Duncan McFarlane is a Fellow of St John's College, Cambridge. </p>&#13; &#13; <p> </p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Yifeng P Chen, Chander Velu, Duncan McFarlane. ‘<a href="https://ieeexplore.ieee.org/document/10202238">The Effect of Robot Adoption on Profit Margins</a>.’ IEEE Transactions on Engineering Management (2023). DOI: 10.1109/TEM.2023.3260734</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have found that robots can have a ‘U-shaped’ effect on profits: causing profit margins to fall at first, before eventually rising again.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Chander Velu</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.gettyimages.co.uk/detail/photo/smart-robot-in-manufacturing-industry-for-industry-royalty-free-image/1408271637?phrase=robot manufacturing&amp;amp;adppopup=true" target="_blank">kynny via Getty Images</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robots on a manufacturing line</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved.
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/social-media/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 03 Aug 2023 10:05:12 +0000 sc604 241131 at The life robotic: Meet the Cambridge University researchers fostering human wellbeing using robots /stories/Cambridge-roboticists-wellbeing-support-robot-coaches <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The team is exploring the capacity robots have to inspire self-reflection, and support the work of psychologists and clinicians. </p> </p></div></div></div> Wed, 19 Jul 2023 10:10:48 +0000 sb726 240791 at