
Cat got your tongue? How AI is on the cusp of a breakthrough that could allow people and ANIMALS to talk to each other in '12 to 36 months'


It sounds like the plot of a new Disney movie, but experts predict AI will allow people to communicate with household pets and even wild animals.

Researchers around the world are using 'digital bioacoustics' - tiny, portable, digital recorders - to capture the sounds, tics and behaviors of animals that are too quiet or nuanced for humans to pick up on.

These databases will be used to train artificial intelligence to decipher these miniature communications and translate them into something more comprehensible to us, almost like a 'ChatGPT for animals'.

Projects such as the Earth Species Project expect a breakthrough in the next 12 to 36 months.

One researcher hopes to be able to unravel the language of dogs

Founded in 2017, the AI non-profit aims to record, understand and ‘talk back’ to animals - from cats and dogs to more unusual species such as whales and crows.

Current Earth Species Project experiments include an attempt to map the vocal repertoires of crows, and another which aims to generate new vocalizations that the birds themselves can understand.

Aza Raskin, one of the co-founders of the Earth Species Project, believes that generating animal vocalizations could be as little as a year away.

Raskin told Google: 'Can we do generative, novel animal vocalizations? We think that, in the next 12 to 36 months, we will likely be able to do this for animal communication.

'You could imagine if we could build a synthetic whale or crow that speaks whale or crow in a way that they can't tell that they are not speaking to one of their own.

'The plot twist is that we may be able to engage in conversation before we understand what we are saying.'

Below are some of the other projects aimed at achieving intelligible communication between people and animals:  

Could AI help us to understand what cats are saying? (Getty)

Cat got your tongue? 

Artificial intelligence could finally unravel a mystery which has dogged the human race for centuries - what are cats actually thinking?

Researchers at the University of Lincoln are using AI to categorize and understand the expressions of cats.

Professor Daniel Mills said: 'We could use AI to teach us a lot about what animals are trying to say to us.'

AI can learn to identify features such as the ear position of cats, which could help to classify and understand the hundreds of expressions cats use to communicate.

Similarly, a new AI model aims to translate the facial expressions and barks of dogs.

Its creator, Con Slobodchikoff, author of Chasing Doctor Dolittle: Learning the Language of Animals, told Scientific American that when we understand animals, it may reveal surprising facts.

'Animals have thoughts, hopes, maybe dreams of their own.'

The bat man

Bats have far more complex language than people thought - they have names, they argue over food, and mother bats use 'baby language' when talking to their young.

That's the conclusion of a pioneering AI study which used a voice-recognition program to analyze 15,000 bat calls, with an algorithm correlating the sounds to videos of what the bats were doing.

Yossi Yovel of Tel Aviv University told the BBC: 'I've always dreamed of a Doolittle machine that will allow me to talk with animals. Specifically, I'm interested in that vocal communication.

'We teach the computer how to differentiate between the different sounds and how to recognize what each sound means when you hear it.

'In the end, the computer will be able to speak the language to understand what they say to each other.'

Researchers now know that bats 'argue' over food, and that baby bats repeat what their mother 'says' in order to learn language.

'Deep learning' is able to decipher bat language (which is largely ultrasonic, and much faster than human speech) - humans can't listen to it, but computers can.

Yovel remains skeptical that a 'decoder' which can instantly translate bats will arrive 'in his lifetime', but is now aiming to understand longer-term social interactions of bats.

Clicking with whales

Microphones on buoys and robotic fish are attempting to unravel one of the most famous 'voices' of the animal kingdom - whale song.

Sperm whales are the world's largest toothed predators and locate their food using clicks - but they also use shorter series of clicks, called 'codas', to communicate with each other.

The Project CETI team are planting microphones on whales to capture huge amounts of data, with a goal of using machine learning to unravel what the vast animals are saying.

One project aims to unravel the clicking 'codas' of sperm whales (Getty)

To attach the microphones, the team uses a thirty-foot pole.

The AI team is already capable of predicting whale codas (sequences of clicks) with up to 95 percent accuracy, and now hopes to increase the volume of data to establish further patterns and work out what the whales are saying.
