The University of Cambridge - Nature Perspectives /taxonomy/external-affiliations/nature-perspectives en Public invited to chat to museum animals in novel AI experiment /research/news/public-invited-to-chat-to-museum-animals-in-novel-ai-experiment <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/fin-whale-1-credit-j-garget885x428px.jpg?itok=wDJcjckr" alt="Jack Ashby talking to the Museum&#039;s fin whale" title="Jack Ashby talking to the Museum&#039;s fin whale, Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>From Tuesday 15 October, the University of Cambridge’s Museum of Zoology is offering visitors a unique experience: the chance to chat with the animals on display – whether skeletal, taxidermy, or extinct.</p> <p>In a collaboration with the company Nature Perspectives, the Museum’s Assistant Director Jack Ashby has chosen a range of animal specimens to bring back to life using generative artificial intelligence (AI).</p> <p>Visitors can pose their questions to 13 specimens – including dodo and whale skeletons, a taxidermied red panda, and a preserved cockroach – by scanning QR codes that open a chat-box on their mobile phone. In two-way conversations, which can be voice- or text-based, the specimens will answer as if they were still alive.</p> <p>This is believed to be the first time a museum has used generative AI to enable visitors to chat with objects on display in this way.</p> <p>By analysing data from the conversations, the team hopes that the month-long experiment will help them learn more about how AI can help the public engage with nature, and about the technology’s potential in museums. 
It will also provide the Museum with new insights into what visitors really want to know about the specimens on display.</p> <p>Nature Perspectives uses AI to enable cultural institutions like the Museum of Zoology to engage the public through these unique conversational experiences. The company aims to reverse a growing apathy towards biodiversity loss by enabling new ways to engage with the natural world.</p> <p>“This is an amazing opportunity for people to test out an emerging technology in our inspiring Museum setting, and we also hope to learn something about how our visitors see the animals on display,” said Jack Ashby, Assistant Director of the University of Cambridge’s Museum of Zoology.</p> <p>He added: “Our whole purpose is to get people engaged with the natural world. So we’re curious to see whether this will work, and whether chatting to the animals will change people’s attitudes towards them – will the cockroach be better liked, for example, as a result of having its voice heard?”</p> <p>“By using AI to simulate non-human perspectives, our technology offers a novel way for audiences to connect with the natural world,” said Gal Zanir, co-founder of Nature Perspectives, which developed the AI technology for the experience.</p> <p>He added: “One of the most magical aspects of the simulations is that they’re age-adaptive. For the first time, visitors of all ages will be able to ask the specimens anything they like.”</p> <p>The technology brings together all available information on each animal involved – including details particular to the individual specimens, such as where they came from and how they were prepared for display in the Museum. This is all repackaged from a first-person perspective, so that visitors can experience realistic, meaningful conversations.</p> <p>The animals will adjust their tone and language to suit the age of the person they’re talking to. 
And they’re multilingual – speaking over 20 languages, including Spanish and Japanese, so that visitors can chat in their native languages.</p> <p>The team has chosen a range of specimens that includes skeletons, taxidermy, models, and whole preserved animals. The specimens are: dodo skeleton, narwhal skeleton, brain coral, red admiral butterfly, fin whale skeleton, American cockroach, huia taxidermy (a recently extinct bird from New Zealand), red panda taxidermy, freeze-dried platypus, giant sloth fossil skeleton, giant deer skull and antlers, mallard taxidermy, and <em>Ichthyostega</em> model (an extinct ancestor of all four-legged animals).</p> <p>Nature Perspectives was created by a team of graduates from the University’s <a href="https://www.cl.geog.cam.ac.uk/">Masters in Conservation Leadership</a> programme, who noticed that people seem to feel more connected to machines when they can talk to them. This inspired the team to apply the same principle to nature – giving nature a voice to promote its agency and foster deeper, more personal connections between people and the natural world.</p> <p>“Artificial intelligence is opening up exciting new opportunities to connect people with non-human life, but the impacts need to be carefully studied. I’m delighted to be involved in exploring how the Nature Perspectives pilot affects the way people feel about and understand the species they ‘meet’ in the Museum of Zoology,” said Professor Chris Sandbrook, Director of the University of Cambridge’s Masters in Conservation Leadership programme.</p> <p>“Enabling museums to engage visitors with the simulated perspectives of exhibits is only the first step for Nature Perspectives. 
We aim to apply this transformative approach widely, from public engagement and education to scientific research, to representing nature in legal processes, policy-making and beyond,” said Zanir.</p> <p>The month-long Nature Perspectives AI experiment runs from 15 October until mid-November 2024.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Specimens in a Cambridge museum will be brought to life through the power of artificial intelligence, by a team aiming to strengthen our connection with the natural world and reverse apathy towards biodiversity loss.</p> </div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">This is an amazing opportunity for people to test out an emerging technology in our inspiring Museum setting.</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Jack Ashby</div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Jack Ashby talking to the Museum&#039;s fin whale</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. 
Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Sun, 13 Oct 2024 23:14:04 +0000 jg533 248221 at