How long before AI can 'understand' animals?

Scientists are working on it, but it's a rough job.

Gamma-Rapho via Getty Images

The Regent Honeyeaters of southeastern Australia are forgetting how to talk. The songbird’s habitat has been so severely devastated that its numbers are dwindling. Worse, the birds that remain are so scattered that adult males are too far apart to teach the young how to sing for a mate — how to speak their own language. The gradual loss of the Honeyeater’s song, its primary tool for wooing a partner, creates a vicious circle of decline.

Humans, on the other hand, cannot shut up. Estimates peg the total number of languages in use today at around 7,000. In the US, roughly 25 percent of people claim they can converse in a second language. In Europe this number floats around 60 percent. In Asia and Africa, bilingualism is even more common, as local tongues and regional dialects live alongside (often multiple) “official” languages. But not one person on this planet can speak Cat or Dog — much less Regent Honeyeater.

The Regent Honeyeater is one of Australia's numerous critically endangered animals, and that country's version of the Passenger Pigeon. Once found in vast flocks, the species has dwindled to around 100 birds (or fewer).
Henry Cook via Getty Images

Understanding animals is a tough nut to crack. For one, do animals even have a “language”? Even if they do, is there all that much to be said beyond the basics of survival? Probably not for most species, but as years of TV shows like Sabrina and films like Free Willy and basically anything Disney will attest, we really do wish we could natter with nature. The good news is that AI might grant us the ability to reliably translate animals in the next decade or so. The less good news is that it won’t be the Babelfish device you might be expecting.

“If you had to pick one component of humans ... that no other animal comes anywhere near being able to do anywhere near as well: Communication is the thing,” James Savage, a behavioral ecologist at both the University of Chester and Anglia Ruskin University, told Engadget. In short, talking is what separates humans from the beasts, so expecting animals to hold a conversation is somewhat paradoxical.

If you’re now wondering about all those documentaries you saw with a dolphin talking to its keeper or a chimpanzee doing sign language, you aren’t disproving this theory; you’ve merely identified the complexity of the question. Animals understanding our language appears attainable, to a degree that tracks their cognitive ability. Going the other way — speaking Dolphin or Chimpanzee — is a different kettle of (non-talking) fish.

The first problem is deciding what an animal language might look like. “One of the defining characteristics of human communication is that it's sequential. We have word tokens, words as it were. And they always occur in a certain sequence,” Jussi Karlgren, a computational linguist, told Engadget.

Much as we might hope, there’s little reason to suggest a pod of porpoises communicates in the same way we do. Not least because of the different vocal machinery, but also their environment, collective needs and, you know, the whole lack of being a human thing.

ac productions via Getty Images

You can’t blame us for thinking that way though. A long-running study into the calls of prairie dogs suggests they can demonstrate something resembling vocabulary. In one experiment, scientists approached the rodents at different times wearing different colored shirts and were able to identify a distinct alarm call for each one. The prairie dogs were basically saying “The woman in the blue shirt is back” or “this time it’s the yellow-shirted person.” Con Slobodchikoff, the lead researcher on this experiment and many more on prairie dogs, told The Atlantic as far back as 2013 that, in his opinion, his subjects had "the most sophisticated animal language that has been decoded."

While this seemingly opens up the tantalizing possibility of “vocabulary” in animal language, there’s also likely an evolutionary limit. An animal might be able to indicate something with a sound or “word,” but only if it has a need to do so. Prairie dogs likely don’t bother to express things like how they feel or what their goals are in life. “The reason, as an animal, you communicate information to another animal, is if there's some benefit to you doing so,” said Savage.

But what if there were an animal with few natural predators and high cognitive abilities? Say, a dolphin? According to Savage, there are hints they might have something more to talk about: “I don't think it's too anthropomorphic to say that in the dolphin case, they have a particular little thing they do, which is their name, because they use it and other dolphins use it to refer to them.” Dolphins, it appears, give themselves names and respond when other dolphins use them.

This phenomenon captured Karlgren’s imagination too, to the extent that he planned a detailed experiment that would feed dolphin calls into an artificial intelligence in the hope of deciphering them.

Thomas Barwick via Getty Images

The premise of using artificial intelligence feels like it should make sense. After all, AI has been shown to be quite effective at deciphering ancient human languages. So why should our mammalian water friends be any different? The answer comes back to the human tendency to think that the human way is the only way. Communication is more than just words; it can be tone, timing, context, facial expressions and more. Now transpose that to the dolphin world and… you can see why things get very complicated very fast. (What does dolphin sarcasm sound like?)

But Karlgren remains optimistic. “The hope is this: That if we collect a large corpus, a large collection of dolphin whistles and click trains, [we might be] able to segment them.” And for that amount of data to yield results, AI really is our only hope.
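Karlgren's proposed pipeline starts with that segmentation step: carving a continuous recording into discrete calls before anyone can try to decode them. As an illustrative sketch only — his real experiment would use far more sophisticated signal processing — here is a hypothetical energy-threshold segmenter run over a synthetic audio stream (all frequencies, durations and thresholds below are made up for the demo):

```python
import math

SR = 8_000     # sample rate in Hz (arbitrary for this toy example)
FRAME = 80     # 10 ms analysis frames

def frame_energy(sig, frame=FRAME):
    """Mean squared amplitude of each consecutive frame."""
    return [sum(x * x for x in sig[i:i + frame]) / frame
            for i in range(0, len(sig) - frame + 1, frame)]

def segment(sig, threshold=0.05):
    """Group runs of above-threshold frames into (start, end) sample spans."""
    spans, start = [], None
    for i, e in enumerate(frame_energy(sig)):
        if e > threshold and start is None:
            start = i * FRAME            # a "call" begins
        elif e <= threshold and start is not None:
            spans.append((start, i * FRAME))  # the call ends
            start = None
    if start is not None:                # call ran to the end of the recording
        spans.append((start, len(sig)))
    return spans

def tone(freq, dur):
    """Synthesize a pure-tone burst standing in for a whistle."""
    return [math.sin(2 * math.pi * freq * i / SR) for i in range(int(SR * dur))]

# Toy "recording": two whistle bursts separated by silence.
stream = [0.0] * 400 + tone(600, 0.05) + [0.0] * 800 + tone(900, 0.07) + [0.0] * 400
spans = segment(stream)  # two spans, one per burst
```

Real hydrophone recordings would of course add background noise, overlapping animals and clicks, which is exactly why a corpus large enough for machine learning is the goal.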

Savage agrees. “Humans are usually pretty good at picking up acoustic differences in animal calls that they are familiar with,” he said, adding that “as artificial intelligence-based algorithms for classifying signals become more advanced, they will very rapidly get to the point where they can do that better than humans can.”

And the early signs are promising. In 2017, scientists were able to identify a number of different marmoset calls with about 90 percent accuracy. In the same year, another team was able to identify when a sheep was in distress by feeding an AI images of its facial expressions alone. Combining these two ideas would provide a more holistic understanding of what animals might be trying to say.
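Neither paper's actual method is reproduced here, but the basic recipe behind such call classifiers — extract an acoustic feature, then compare it against labeled examples — can be sketched in toy form. Everything below is hypothetical: three invented call types synthesized as pitch-jittered tones, a crude zero-crossing-rate feature in place of a real spectrogram, and a nearest-centroid rule in place of a trained neural network:

```python
import math
import random

SR = 8_000          # sample rate (Hz)
DUR = 0.1           # call length (s)
# Hypothetical call types, stand-ins for e.g. alarm vs. contact calls.
CALL_FREQS = {"alarm": 400, "contact": 800, "food": 1600}

def synth_call(kind, rng):
    """Synthesize a toy call: a pure tone with +/-5% pitch jitter."""
    freq = CALL_FREQS[kind] * rng.uniform(0.95, 1.05)
    return [math.sin(2 * math.pi * freq * i / SR) for i in range(int(SR * DUR))]

def zero_crossing_rate(sig):
    """Crude pitch proxy: fraction of adjacent samples that change sign."""
    crossings = sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0)
    return crossings / len(sig)

def train(labeled_calls):
    """Nearest-centroid 'model': mean feature value per call type."""
    feats = {}
    for kind, sig in labeled_calls:
        feats.setdefault(kind, []).append(zero_crossing_rate(sig))
    return {kind: sum(v) / len(v) for kind, v in feats.items()}

def classify(model, sig):
    """Assign the call type whose centroid is nearest to the signal's feature."""
    x = zero_crossing_rate(sig)
    return min(model, key=lambda kind: abs(model[kind] - x))

rng = random.Random(0)
train_set = [(k, synth_call(k, rng)) for k in CALL_FREQS for _ in range(20)]
model = train(train_set)
test_set = [(k, synth_call(k, rng)) for k in CALL_FREQS for _ in range(20)]
accuracy = sum(classify(model, s) == k for k, s in test_set) / len(test_set)
```

On clean synthetic tones this separates perfectly; the hard part of the published work is that real marmoset calls overlap in their acoustics, which is where deep learning earns its keep.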

Catherine Falls Commercial via Getty Images

Both Savage and Karlgren suggest that great strides can be made in the next ten years or so, even if the result might not be the Google Translate for animals app we desire. There’s also the question of whether something like that is in anyone’s interest. “I think it almost cheapens animals to have that approach to them, where they have to interact with each other and others in the same way that we want to interact with them,” Savage said.

AI might well become a valuable tool in animal husbandry, either at a research or industrial level, but there are still important things we can do right now with the tools we currently have. Savage gave the example of the Kakapo, a large, flightless parrot found in New Zealand. When it’s time to mate, Kakapo males dig a small pit and make a booming noise, using the hollow they’ve made to amplify it. Females pick their mates by the “quality” of that boom. But in a small population this poses an issue: only a few males are successful, and the gene pool becomes limited.

Savage explains how conservationists were able to retire the best “boomers” and move them to another island where many juvenile males live. This allows the younger generation to learn from the successful males and become adept boomers themselves. As the young mature, they are placed with the females, able to profit from their newfound linguistic abilities. Slowly, the species can recover without the risk of a genetic bottleneck. Now if only we could tell the Regent Honeyeaters about this.

If artificial intelligence eventually delivers on its promise, maybe one day we can.