Can Machine Learning Help Us Build an Animal Translator?

Andrei Mihai

A female humpback whale swimming with her calf. Image credits: National Marine Sanctuaries / Wikipedia. CC BY 3.0

In 2023, humanity had its first contact with a non-human civilization. Well, sort of. Scientists from the SETI Institute, the University of California, Davis, and the Alaska Whale Foundation teamed up to communicate with a humpback whale named Twain in Southeast Alaska.

The team used pre-recorded calls and was surprised to see Twain approach and circle their boat in a “greeting” behavior. Over a 20-minute window, the team broadcast the calls to Twain, matching the interval variations between each playback call. Essentially, they communicated in “humpback” tongue.

This is, of course, a rudimentary form of communication. The team demonstrated that whales can participate in turn-taking vocal exchanges with a different species and that some messages can be broadcast back and forth. Yet, although the general context was understandable, the scientists understood very little of what they were actually telling Twain, or of what Twain was saying back.

This raises the question: are we able to properly communicate with animals?

The short answer is ‘not yet’, but thanks to advancements in machine learning, we are getting closer than ever to understanding other species – and, in the process, to improving our understanding of communication itself.

The Whale Alphabet

Spectrogram of humpback whale vocalizations. Image credits: Spyrogumas / Wikipedia. CC BY 3.0.

Whales are highly cooperative and social, and they emit vocalizations that can be picked up from hundreds or even thousands of miles away. This makes them excellent candidates for interspecies communication.

Sperm whales, for instance, are highly communicative creatures. They do not produce long, melodic calls like humpback whales, but they have a system of clicks called codas. Since 2005, researchers led by Shane Gero from Carleton University in Ottawa have been following a community of around 400 sperm whales, recording their vocalizations and trying to decipher as much as possible about them.

A subset of the codas that sperm whales use has been shown to encode information about caller identity; in other words, the whales introduce themselves when talking. Almost everything else, however, is a message the whales are sending to one another, and the contents of those messages are challenging to decipher.

There are many features of communication that we rarely consider. The intensity and pitch of the sounds are obviously important, but there are also more subtle elements, like rhythm and alternation, whether the tone is rising or falling, and whether there is any tonal ornamentation. All of this makes for a lot of data to analyze and understand.

In recently published research, Gero and his colleagues try to lay the groundwork for understanding the “whale alphabet.” They claim whales are much more expressive than previously thought and have a complex phonetic alphabet that is not unlike that of humans.

Jacob Andreas, one of the study’s co-authors, thinks a big data approach can be used for this purpose. He is training machine learning algorithms to recognize features of whale communication, and maybe even understand what the whales are saying.

The sheer volume and complexity of the data make machine learning a suitable approach. Gero’s work has provided an extensive, high-quality dataset to work with (and he is far from the only biologist studying whale communication). Using this, AI models can identify and classify different types of whale clicks, distinguishing between the echolocation clicks used for navigation and the communication clicks used for social interaction.

The algorithms cluster codas based on their inter-click intervals, helping to identify distinct patterns and structures. Representing the complex acoustic signals in high-dimensional feature spaces enables a more detailed analysis of how the communication is structured.
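
To make the clustering step more concrete, below is a minimal, illustrative sketch of grouping codas by their inter-click intervals. This is not the researchers’ actual pipeline: the toy codas, the duration-based normalization, and the choice of two clusters are assumptions made purely for the example.

```python
# Illustrative sketch only: clustering sperm whale codas by their
# inter-click intervals (ICIs). Assumes each coda has already been
# detected and reduced to a fixed number of intervals; the data,
# normalization, and cluster count are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Toy data: each row is one coda, each column an inter-click interval in seconds.
codas = np.array([
    [0.10, 0.11, 0.10, 0.42],   # e.g. a "3 + 1" rhythm
    [0.11, 0.10, 0.12, 0.40],
    [0.20, 0.21, 0.20, 0.20],   # e.g. a regular rhythm
    [0.19, 0.20, 0.21, 0.19],
])

# Normalizing by total coda duration focuses the clustering on rhythm
# (relative timing) rather than overall tempo.
rhythm = codas / codas.sum(axis=1, keepdims=True)

# Standardize the features and group the codas into a small number of types.
features = StandardScaler().fit_transform(rhythm)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for coda, label in zip(codas, labels):
    print(f"coda {coda} -> cluster {label}")
```

In practice, researchers work with many thousands of recorded codas and richer feature representations, but the underlying idea of embedding each coda in a feature space and grouping similar ones is the same.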

The results are early but promising. They suggest that whale communication has a hierarchical structure, similar to sentences in human languages, and they hint at the smallest meaning-carrying units and their combinations, something akin to human words. This kind of data also allows for a form of partial validation.

To validate the AI-generated models, the researchers conducted playback experiments, in which recorded whale sounds were played back to live whales to observe their responses. This approach helps test hypotheses about the meanings and functions of different codas and, while it is not perfect, it grounds the AI-derived models in real-world whale behavior.

Does this mean we can talk to whales at the moment? Not really. But it does mean we are getting a more fundamental understanding of how some whales communicate, and we may even pick up a few whale “words” and “sentences” here and there.

However, we do not know whether whales communicate the way we do or whether their language is more like music. It is nonetheless possible that we will soon start “talking” to whales, even without a perfect understanding of what we are communicating.

Understanding Farm Animals

Left: Localized facial landmarks. Right: Normalized sheep face marked with feature bounding boxes. Image credits: University of Cambridge / Media Release.

From the vast expanse of the ocean, we now dive into the pens of farm animals. We kill and eat (or discard) around 80 billion land animals every year. While the livestock industry is not really interested in inter-species communication, it does care about information for practical decisions – and here too, AI can be of help.

In 2017, Peter Robinson, who was working on teaching computers to recognize emotions in human faces, had an idea: he thought he could do the same thing with sheep. The core of this approach can be traced back to Darwin, who argued that humans and animals show emotion through remarkably similar behaviors. In this spirit, Robinson and his colleagues saw an opportunity to develop an AI system that estimates whether sheep are in pain.

“The interesting part is that you can see a clear analogy between these actions in the sheep’s faces and similar facial actions in humans when they are in pain – there is a similarity in terms of the muscles in their faces and in our faces,” said co-author Marwa Mahmoud in a press release from 2017.

They worked with a dataset of 500 photos of sheep taken by veterinarians as they were providing treatment; the veterinarians also estimated the pain levels of the sheep. After training, the model was able to estimate sheep pain with around 80% accuracy, and larger datasets could improve its performance further.
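
As an illustration of this kind of supervised approach, and not the Cambridge team’s actual pipeline, one could train a standard classifier on geometric features derived from the facial landmarks, using the veterinarians’ pain ratings as labels. The features and data below are synthetic placeholders.

```python
# Illustrative sketch only: predicting sheep pain from facial-landmark
# features with a standard classifier. The features and labels here are
# synthetic; the real study used its own landmark pipeline and dataset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-photo features derived from facial landmarks, such as
# ear rotation, eye narrowing, or nostril shape (one row per photo).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))      # 500 photos, 6 geometric features
y = rng.integers(0, 2, size=500)   # 0 = no pain, 1 = pain (vet-labelled)

# A simple SVM pipeline; accuracy is estimated with cross-validation.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```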

For farmers, this could be important. Identifying discomfort or pain in sheep early would allow them to intervene quickly and provide medical attention sooner.

“I do a lot of walking in the countryside, and after working on this project, I now often find myself stopping to talk to the sheep and make sure they’re happy,” said Robinson.

Another approach, presented in a different study, also deals with sounds, specifically pig grunts. Here, the goal was to distinguish positive pig grunts (associated with joy or playfulness) from grunts indicating negative emotions (like pain or fear).

Image credits: Kenneth Schipper Vera. CC BY 3.0.

They used a dataset of over 7,000 acoustic recordings gathered throughout the various life stages of 411 pigs, from birth to slaughter, with the pigs’ emotions defined by how they naturally react to various positive and negative external stimuli. After training the algorithm, the team claimed it could attribute 92% of the calls to the correct emotion.
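
As a rough sketch of how such an acoustic classifier might be put together, the snippet below extracts simple spectral features from each recording and trains an off-the-shelf classifier. The file names, feature choices, and model are placeholders; the published study used its own feature extraction and modelling pipeline.

```python
# Illustrative sketch only: classifying the emotional valence of pig grunts
# from acoustic features. File paths, labels, and the model are placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def grunt_features(path):
    """Load one recording and summarize it as a fixed-length MFCC vector."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    # Mean and standard deviation over time give a simple fixed-size summary.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical dataset: (file, label) pairs, where 1 = positive context
# (e.g. play) and 0 = negative context (e.g. fear or pain).
recordings = [("grunt_001.wav", 1), ("grunt_002.wav", 0)]  # ~7,000 calls in the real dataset

X = np.array([grunt_features(path) for path, _ in recordings])
y = np.array([label for _, label in recordings])

# With the full dataset, performance would be estimated with cross-validation.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)
```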

Other efforts are also taking an approach similar to the whale research. James Chen, an animal data sciences researcher and assistant professor at Virginia Tech, is building a database of cow vocalizations. Here too, the main focus is animal welfare, although Chen also believes he can spot which cows burp less (cows are an important source of methane emissions, and methane is a potent greenhouse gas).

Ultimately, communicating with farm animals is more about understanding when they are in trouble and less about two-way communication. AI is once more a great enabler in this area.

From Our Pets to the Animal Kingdom

Everyone who has ever had a pet must, at some point, have wanted to communicate directly with it. You may or may not be happy to hear what your cat (or dog) thinks of you, but several teams are working on it. Most such research takes an approach similar to the pig-grunt work: detecting the emotions behind vocalizations. The same emotion-prediction approach is also being applied to cats.

There has been a great deal of recent progress, and there are now algorithms that can estimate pet emotions and tell whether a bark is playful or aggressive (at least for some breeds), but we are not yet able to talk to our pets directly.

For now, we are not able to truly talk to any other species. However, we may be witnessing the first steps in the quest to decode animal communication, and AI is playing a key role.

Most such efforts, be they for whales, sheep, or cats, follow the same general approach: start from a reliable database (usually of vocalizations), tag and classify the data, cluster it, and attempt to decode communication patterns and elements. There is an important caveat, however: AI has a strong human bias.

Bees communicate the location of food sources through dances; birds have exquisite, elaborate songs and chirps; insects use chemical signals; whales “sing” or click. The animal world has a near-infinite diversity of communication, and our models make simplifying assumptions because they are built around the structure of human communication. We (and our models) try to interpret everything through our own lens, but this is not always accurate. Quite possibly, it is never accurate.

Several groups of researchers are attempting to decode animal communication through AI, but they face important, foundational challenges.

In a recent paper, researchers Yossi Yovel and Oded Rechavi from Tel Aviv University explored the potential for AI to understand non-human animal communication, drawing inspiration from the fictional Doctor Dolittle, who could converse with animals. They argue that communicating with animals involves far more than directly translating their sounds into human language, emphasizing the intricate context and multifaceted nature of such interactions.

They note three main obstacles in this quest. First, the context in which animals communicate is crucial. For example, while AI can replicate animal sounds, understanding the intent behind these sounds (such as whether a bird is singing to attract a mate or to signal territory) requires contextual information.

Second, eliciting natural responses from animals is fraught with challenges. Animals’ behaviors are influenced by various factors, including their physiological state and environmental conditions. Capturing authentic communication without conditioning the animals requires diverse observational techniques. Because of its biases and limitations, AI may also misinterpret subtle animal behaviors as responses, leading to inaccurate conclusions.

Finally, the limited range of contexts in which animals typically communicate, such as alarm signals or courtship behaviors, constrains the breadth of potential interspecies communication.

Some of these challenges can be overcome. The key is not to rely on AI algorithms to decode everything, but to work across disciplines, with field specialists tagging contextual information and interpreting the findings in their proper context.

Ultimately, the dream of communicating directly with other species is not yet a reality, but we have a plausible roadmap towards achieving it. At least in some cases, understanding and communicating with another species does not seem impossible. Which raises the question: if we really could talk to another species, what would we say?

The post Can Machine Learning Help Us Build an Animal Translator? originally appeared on the HLFF SciLogs blog.