Complex Made Simple

What if animals could speak? Well, they already do, and soon apps may translate every word they say

Natural language processing (NLP) technologies can enable a machine to understand human language, but what about animal language? 

To start, researchers are building deep learning models, training them on an animal language database in which various voice signals are linked with expressions of emotions and sensory feelings.  
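For a sense of what that looks like in practice, here is a minimal sketch of such a model in Python. The emotion labels, feature vectors, and network layout are all assumptions made for illustration; it shows the general approach, not any team's actual system.

```python
# Minimal sketch: train a small classifier that maps vocalization features to
# emotion categories. Labels and data here are synthetic stand-ins.
import torch
import torch.nn as nn

EMOTIONS = ["content", "fear", "hunger", "alarm"]  # hypothetical label set

# Stand-in for a real database: 500 clips, each summarized as a 64-dim feature vector.
features = torch.randn(500, 64)
labels = torch.randint(0, len(EMOTIONS), (500,))

model = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, len(EMOTIONS)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()

# Predict the "emotion" of a new clip's feature vector.
with torch.no_grad():
    print(EMOTIONS[model(torch.randn(1, 64)).argmax().item()])
```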

A 2017 Amazon-sponsored report on future trends predicted that in just 10 years, we’ll have a translator for pets. Not sure many want to find out the secrets these animals know, but there is an ongoing project on just that.

We look to be well on the way to understanding animals, and perhaps even talking with them.

Human-animal communication

The idea of tech-based animal language translation has become more serious, as we’ve discovered parallels between birdsong and human speech and developed dolphin whistle decoders. Now, artificial intelligence is bringing exciting new power and potential to the topic of human-animal communication. 

The biologist Con Slobodchikoff discovered detailed messages in the prairie dog sounds he recorded and studied for decades. With the goal of improving the relationship between dogs and their owners, Zoolingua, the company he founded, is now developing a mobile app designed to translate dog body language and sounds into English. The team is collecting data on dog body language, facial expressions, and vocalizations for model training. 

Zoolingua says the tech could be extended to other pets to improve communication and human understanding: “Where we’ve only seen behavior issues, we might hear and understand their fear, pain, and needs.”

In fact, today’s machine-learning systems analyze data and look for correlations with startling efficiency; often, they find statistical connections that human analysts miss. 

Researchers at Google have developed an AI system that can translate from an “image map” to a language map. Given enough training data, these AI algorithms can extract semantic meaning from a range of non-linguistic inputs.
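The underlying idea can be sketched simply: learn a mapping from one embedding space into another, then read meaning off the nearest words. The toy example below uses synthetic vectors and a plain linear projection; those choices are assumptions for illustration, not Google's actual system.

```python
# Toy sketch: learn a linear map that projects image embeddings into a
# word-embedding space, then label a new image by its nearest vocabulary word.
import numpy as np

rng = np.random.default_rng(0)
img_dim, txt_dim, n_pairs = 512, 300, 1000

image_embs = rng.normal(size=(n_pairs, img_dim))  # e.g. CNN features (synthetic here)
word_embs = rng.normal(size=(n_pairs, txt_dim))   # e.g. caption embeddings (synthetic)

# Least-squares fit of a projection W so that image_embs @ W ~ word_embs.
W, *_ = np.linalg.lstsq(image_embs, word_embs, rcond=None)

vocab = {"dog": rng.normal(size=txt_dim), "whale": rng.normal(size=txt_dim)}
new_image = rng.normal(size=img_dim)
projected = new_image @ W

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Pick the vocabulary word whose embedding is closest to the projected image.
print(max(vocab, key=lambda w: cosine(projected, vocab[w])))
```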

Britt Selvitelle, a computer scientist who worked on the team that created Twitter, is a founding member of the Earth Species Project, an organization developing AI approaches like this to animal communication. 

“We’re working on decoding the first nonhuman language,” he said, a goal that he thinks can be reached in five to ten years. 

The loose correspondences between human and animal words and concepts may not matter to an AI; neither will the fact that animal ideas may be expressed not as vocalizations but as gestures, sequences of movements, or changes in skin texture. 

Whales

Since the 1960s, scientists have been detailing how whales use a variety of noises to identify objects, survey their surroundings, and communicate with each other. Thanks to advances in machine learning and artificial intelligence, we’re now beginning to pull back the curtain on that communication in truly exciting ways. 

In 2011, Zooniverse, a platform for conducting scientific research, collected over 15,000 samples of pilot whale and orca (killer whale) calls from off the coasts of the Bahamas, Iceland, and Norway to see whether computer analysis, using artificial intelligence, could decipher anything about these populations and the ways they use their sonic repertoire. 

It turns out that members of the same species may use different dialects depending on where they live, just as people have different accents.

The analysis uncovered evidence of different communication styles, or dialects, within each species.  
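Conceptually, this kind of analysis is unsupervised pattern-finding: summarize each call as a feature vector, cluster the calls, and check whether clusters line up with where the animals were recorded. The sketch below illustrates that workflow on synthetic data; the features, counts, and clustering method are assumptions, not the project's actual pipeline.

```python
# Sketch: cluster per-call acoustic features and compare cluster membership
# with recording site. Strong alignment suggests regional "dialects".
# All data here is synthetic, not real whale recordings.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
sites = ["Bahamas", "Iceland", "Norway"]

# Fake features: calls from each site drawn around a slightly different centre.
calls, site_labels = [], []
for i, site in enumerate(sites):
    calls.append(rng.normal(loc=i * 2.0, size=(200, 20)))
    site_labels += [site] * 200
X = np.vstack(calls)

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Cross-tabulate clusters against sites.
for c in range(3):
    members = [site_labels[j] for j in range(len(X)) if clusters[j] == c]
    print(c, {s: members.count(s) for s in sites})
```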

In April of this year, a team of experts in robotics, linguistics, and machine learning established Project CETI, a scientific endeavor with the goal of applying new advances in machine learning to better understand sperm whale language. 

Sperm whales “talk” to one another through intensely loud series of clicks, called codas. The clicks can reach upwards of 230 decibels, louder than a rocket launch, making sperm whales the loudest animals on the planet and enabling them to communicate with one another over distances of hundreds of miles. 
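One simple way to capture a coda's structure is to record the timing between its clicks, so each coda becomes a short rhythmic signature. A toy sketch, with invented click times:

```python
# Toy sketch: represent a coda by its inter-click intervals (ICIs).
# The click times below are invented, not real sperm whale data.
click_times = [0.000, 0.182, 0.365, 0.551, 0.920]  # seconds within one coda

icis = [round(t2 - t1, 3) for t1, t2 in zip(click_times, click_times[1:])]
print(icis)  # [0.182, 0.183, 0.186, 0.369] -> the coda's rhythmic "signature"
```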

Much research supports the idea that cetaceans (marine mammals) have highly-developed brains capable of complex cognition. Sperm whale brains also contain specialized neurons called spindle cells, which, according to NewScientist, are also found in human brains in the regions responsible for empathy, social organization, and speech. 

CETI is building new audio and video equipment to record sperm whale calls by the millions to get as complete a sonic picture as possible. 

Chickens

Understanding animal language and improving animal-human communication is also appealing for farmers. Over the past five years, researchers from the Georgia Institute of Technology have been collecting sound recordings from chickens.

The recordings were made while the chickens were exposed to certain conditions, e.g., heat or cold, light or dark. The collected sounds were then used to train a machine learning model to distinguish contented from distressed birds. The model has proven capable of detecting “emotional” changes in chickens with near-perfect accuracy. The study’s findings can be used to improve poultry conditions and productivity.
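The pipeline behind such a result is straightforward to sketch: summarize each recording as a set of acoustic features, label it by the condition it was recorded under, and train a classifier to tell the two apart. The example below does this on synthetic features with an off-the-shelf classifier; it illustrates the general method, not the Georgia Tech team's actual model.

```python
# Sketch: binary "content" vs "distressed" classifier on synthetic acoustic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)

# 400 clips per condition, each summarized by 32 features; the distressed clips
# are shifted so the two classes are separable in this toy setup.
content = rng.normal(0.0, 1.0, size=(400, 32))
distressed = rng.normal(1.5, 1.0, size=(400, 32))
X = np.vstack([content, distressed])
y = np.array([0] * 400 + [1] * 400)  # 0 = content, 1 = distressed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```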

Elephants 

Silicon Valley-based Conservation Metrics has been working with researchers in Africa to apply new AI techniques to wild elephant protection. The project has collected 900,000 hours of recordings of elephant vocalizations in the Central African forest. Researchers were able to identify the sounds for greetings and other daily communications in a particular elephant herd and, most importantly, the sound the elephants make when poachers (elephant hunters) are spotted. The project plays an important role in anti-poaching and other conservation efforts.
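Finding one known call type in hundreds of thousands of hours of audio is, at its simplest, a detection problem: slide a template (or a trained detector) across the recording and flag the spots that score highly. The toy example below hides a synthetic “call” in noise and finds it by cross-correlation; it stands in for, rather than reproduces, the project's actual methods.

```python
# Toy sketch: scan a long recording for one known call type with a sliding template.
# Signal and template are synthetic, not elephant audio.
import numpy as np

rng = np.random.default_rng(3)
sr = 1000                                      # samples per second (toy rate)
recording = rng.normal(0, 0.2, size=60 * sr)   # one "minute" of background noise

template = np.sin(2 * np.pi * 40 * np.arange(sr) / sr)  # 1-second synthetic call
recording[20 * sr:21 * sr] += template                   # hide the call at t = 20 s

# Cross-correlation: high scores mark likely occurrences of the call.
scores = np.correlate(recording, template, mode="valid")
best = int(np.argmax(scores))
print(f"strongest match at {best / sr:.1f} s")  # expected near 20.0 s
```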

Gorillas and monkeys

Koko, a gorilla who lived in a preserve in the Santa Cruz Mountains until her death in 2018, learned many words in a modified version of American Sign Language.

A bonobo, or pygmy chimpanzee, named Kanzi was studied by the primatologist Sue Savage-Rumbaugh beginning in the 1980s and was found to be able to understand complex human commands and communicate using keyboard symbols. 

These animals didn’t just ask for this or that object; they also learned to convey sadness, for instance through hand gestures mimicking the flow of tears.

Researchers have devised systems of symbols that animals can use by touching or pointing. An underwater keyboard for dolphins allowed the animals to quickly figure out, without instruction, how to request a body rub or a ball. 

Between 2016 and 2019, at the National Aquarium in Baltimore, studies used an eight-foot underwater touch screen fitted with dolphin-friendly interactive apps, including a version of Whac-A-Mole in which fish move across the display.