Inside the Museum of Zoology at Cambridge: specimens and skeletons on display across the museum's various levels

December 13, 2024, by Brigitte Nerlich

Chatting with a cockroach – a bonus post

Last week I posted my overview of the blog posts I have written over the year, and I thought that I was done with blogging for the rest of the year. Little did I know. Given that we now hear about AI speaking or learning the language of atoms, I had to do a quick post on that. And today, I can’t resist reminiscing about an encounter I had last week in Cambridge. As part of a few days in Cambridge, we went to the Museum of Zoology. Fortunately, I had been before because we didn’t see many of the exhibits! Not to worry. We amused ourselves otherwise.

Bumping into chatbots (and some tech problems)

We were just standing around looking down into the well of the museum (see featured image) when we were accosted by a nice lady who asked us whether we wanted to have a chat with some of the exhibits through the medium of artificial intelligence (I don’t think she said it quite like that, but that’s the gist of it).

I then remembered having read about an experiment the museum was running, in which visitors are invited to chat with specimens. I had totally forgotten about that, but was curious to try it out.

First, we had to scan a QR code near the exhibit. So far, so good. Then I got stuck. The problem was that one needed wifi access to reach a certain website, and despite trying everything, my stupid phone didn’t want to cooperate. Ha, technology! It had worked fine everywhere else. So, we tried my husband’s phone, and that worked for some reason. That all took a while, and I only hope that other people have better luck than me.

Chatting with a cockroach and a sea turtle

We were standing right in front of an American cockroach by then. So, we pointed the phone at its QR code, and I asked it the rather provocative question ‘Do you think you are ugly?’ It replied in a deep male voice and told me that beauty is in the eye of the beholder, that its inner beauty lies in its resilience and more. I felt suitably chastened.

Wandering about a bit, we found ourselves in front of the skeleton of a giant leatherback sea turtle, the largest turtle in the world. Again, we pointed our phone at the QR code etc. and I asked, “Do you find that your carapace is a burden to you?” A refined female voice explained that the carapace was not a burden at all, as it “was perfectly suited to my migratory lifestyle. It wasn’t as heavy as it might seem – every ridge and curve was designed for a life of endless travel across the sea”.

Not chatting with the dodo

In all the technical kerfuffle we forgot to actually talk to the dodo!!! But here is a YouTube video of Jack Ashby, Assistant Director of the museum, talking briefly to it as well as to a platypus. When asked what it’s like to be a platypus, it went very mystical and deep and said things like: “It’s like dancing to nature’s most eclectic symphony”. The dodo was equally poetic, although it had a more down-to-earth accent. Listen here to how it responds to Jack asking it how it is.

How was it done?

As The Guardian reported, “the project was devised by Nature Perspectives, a company that is building AI models to help strengthen the connection between people and the natural world. For each exhibit, the AI is fed specific details on where the specimen lived, its natural environment, and how it arrived in the collection, alongside all the available information on the species it represents. The exhibits change their tone and language to suit the age of the person they are talking to, and can converse in more than 20 languages, including Spanish and Japanese. The platypus has an Australian twang, the red panda is subtly Himalayan, and the mallard sounds like a Brit. Through live conversations with the exhibits, Ashby hopes visitors will learn more than can fit on the labels that accompany the specimens.”
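The Guardian piece doesn’t reveal the nuts and bolts, but for the technically curious, here is a rough sketch of how such a specimen ‘persona’ might be put together in code. Everything below – the class, the fields, the prompt wording – is my own illustrative guess, not Nature Perspectives’ actual system.

```python
# Illustrative sketch only: Nature Perspectives' implementation is not
# public. This shows one plausible way a specimen "persona" could be
# assembled from curated metadata and handed to a chat model as a
# system prompt. All names and fields are hypothetical.

from dataclasses import dataclass, field

@dataclass
class SpecimenProfile:
    species: str
    voice: str                     # e.g. "Australian twang" for the platypus
    habitat: str
    collection_history: str        # how the specimen arrived in the collection
    facts: list[str] = field(default_factory=list)

def build_system_prompt(profile: SpecimenProfile,
                        visitor_age: int,
                        language: str = "English") -> str:
    """Compose a persona prompt that adapts tone to the visitor's age."""
    tone = "simple and playful" if visitor_age < 12 else "conversational and informative"
    fact_list = "\n".join(f"- {fact}" for fact in profile.facts)
    return (
        f"You are a museum specimen: a {profile.species}.\n"
        f"Speak in the first person with a {profile.voice} voice.\n"
        f"Answer in {language}, in a {tone} tone.\n"
        f"Your natural habitat: {profile.habitat}.\n"
        f"How you came to the collection: {profile.collection_history}.\n"
        f"Curated facts you may draw on:\n{fact_list}"
    )

mallard = SpecimenProfile(
    species="mallard (Anas platyrhynchos)",
    voice="British",
    habitat="wetlands, ponds and rivers across the northern hemisphere",
    collection_history="a taxidermy specimen held by the Museum of Zoology",
    facts=["Males have an iridescent green head.",
           "Mallards are dabbling ducks, feeding at the water's surface."],
)

print(build_system_prompt(mallard, visitor_age=9, language="Spanish"))
```

Something along these lines would explain how the same specimen can sound different to a child and an adult, or switch between twenty-odd languages: the persona is recomposed per conversation rather than baked into the model.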

Museum curation and AI

The data from the conversations will be analysed and, as the same Guardian article points out: “the team hopes that the month-long experiment will help them learn more about how AI can help the public to better engage with nature, and about the potential for AI in museums. It will also provide the museum with new insights into what visitors really want to know about the specimens on display”. (I have just filled in the post-visit questionnaire.)

But what about ‘hallucinations’, or rather confabulations, I wondered… and I asked myself, like the BBC, “how will the company [Nature Perspectives] ensure the AI responds correctly and without making up replies? Mr [Gal] Zanir said the AI was ‘fine-tuned’ on a curated set of scientific data selected by its team of ecology experts. While the AI will draw on broader available knowledge, it will prioritise the specific knowledge that had been ‘carefully curated’”.
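Again, none of the actual code is public, but the idea of prioritising ‘carefully curated’ knowledge over the model’s broader knowledge can be sketched in a few lines. The topics and passages below are made up for illustration:

```python
# Purely illustrative: the BBC quote only says the model was "fine-tuned"
# on curated scientific data and told to prioritise it. One common way
# to enforce that priority is to look up curator-vetted passages first
# and only fall back to the model's general knowledge when none match.

CURATED = {  # hypothetical curator-vetted statements, keyed by topic
    "carapace": "The leatherback's carapace is leathery rather than bony, "
                "and is streamlined for long-distance swimming.",
}

def answer_context(question: str) -> str:
    """Prefer curated passages; otherwise fall back to general knowledge."""
    for topic, passage in CURATED.items():
        if topic in question.lower():
            return f"Use this curated fact where relevant:\n{passage}"
    return ("No curated passage matches. Answer from general knowledge, "
            "and say so if you are unsure.")

print(answer_context("Do you find that your carapace is a burden to you?"))
```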

So, museum ‘curation’ has taken an AI turn, like so many other things.

Right time, right place

This is possibly the first time a museum has used generative AI (for a limited period) to allow visitors to chat with specimens, and we were in the right place at the right time. I would have loved to play a bit more with the bots and try out various things, for example, interacting with the cockroach as a child rather than an adult, or in German rather than English. But we needed to be elsewhere for lunch and so off we went… forgetting the dodo.

Images by me, apart from the dodo, which is by Emőke Dénes, Wikimedia Commons.


Posted in artificial intelligence