Will AI soon be able to help us translate what animals say to each other? Researchers are working on it

What do animals say to each other when they call out to one another? This age-old mystery may one day be solved, particularly with the help of artificial intelligence. 2025 could mark a turning point, explains Wired, which reports that recent advances in AI and large language models (the famous LLMs that power chatbots like ChatGPT) should make significant progress possible.

Many research groups have spent years developing algorithms to make sense of animal sounds. One example is the Ceti project, which has worked to decipher the click sequences of sperm whales and the songs of humpback whales. These modern machine-learning tools require extremely large volumes of data, which until now have been lacking. That could change soon.

AI works more easily on human language first and foremost because we have a vast, almost unlimited, body of data to draw on. By contrast, the Ceti project had only around 8,000 sounds with which to analyze and decipher sperm whale communication, a tiny amount by machine-learning standards.

Decipher or translate

Furthermore, when scientists work with human language, they already know what is being said and what constitutes a word or a sentence. That is a huge advantage over interpreting animal communication: science still struggles to determine, for example, whether the howl of one wolf means something different from the howl of another, or whether there are analogies between howls, as there are between our words.

2025 should bring progress on this front. Automated recording of animal sounds is now within the reach of any research group, thanks to low-cost devices such as the AudioMoth. These can be left in the field to capture the calls of many animals around the clock, seven days a week.

Processing these sounds, long a challenge in itself, is also becoming simpler: new automatic detection algorithms can comb through thousands of hours of recordings and sort the captured calls according to their acoustic characteristics. Powerful analytical tools, built on deep neural networks, can then be applied to search for hidden structure in the sequences of vocalizations.
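
To give a concrete sense of what such a pipeline can look like, here is a minimal sketch in Python using the librosa and scikit-learn libraries. It is not the workflow of any particular project: the file name, thresholds and number of clusters are purely illustrative, and it uses simple MFCC features with k-means clustering where real studies rely on far more sophisticated detection models and deep neural networks.

```python
# Minimal, illustrative pipeline: detect loud segments ("calls") in a long
# field recording, summarise each one with MFCC features, then group similar
# calls together. File name, thresholds and cluster count are placeholders.
import numpy as np
import librosa
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Load a (hypothetical) recording, e.g. retrieved from an AudioMoth in the field.
audio, sr = librosa.load("field_recording.wav", sr=None, mono=True)

# 1. Detection: keep the intervals whose energy rises above the background.
intervals = librosa.effects.split(audio, top_db=30)

# 2. Feature extraction: describe each detected call by the average of its
#    MFCCs, a compact summary of its spectral shape.
calls, features = [], []
for start, end in intervals:
    segment = audio[start:end]
    if len(segment) < sr // 10:          # ignore blips shorter than 0.1 s
        continue
    mfcc = librosa.feature.mfcc(y=segment, sr=sr, n_mfcc=13)
    features.append(mfcc.mean(axis=1))   # one 13-dimensional vector per call
    calls.append((start / sr, end / sr))

# 3. Clustering: group calls with similar acoustic characteristics.
X = StandardScaler().fit_transform(np.array(features))
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

for (t0, t1), label in zip(calls, labels):
    print(f"call at {t0:6.2f}-{t1:6.2f} s -> cluster {label}")
```

In practice, the simple clustering step above would be replaced by deep-learning models trained on such call representations; that is where the search for hidden structure in vocalization sequences actually takes place.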

But before going any further, we need to decide precisely what we hope to do with these animal sounds and what information we can expect to draw from them. Some organizations, such as Interspecies.io, have set themselves a clearly defined goal: to "transform signals from one species into coherent signals for another", in other words to translate animal communication into human language.

Other groups and institutions are a little less ambitious, speaking not of "translating" but simply of "deciphering". Most scientists agree that non-human animals do not have a language of their own, at least not in the sense that humans do, and to date we do not know exactly how much information animals actually transmit to one another. That is an important subtlety.
