Anyone with a pet knows that the ability to decode their needs comes from daily observation. Over time, we become so in tune with our pets that a simple look in their eyes or a distinct rhythm of tail wagging tells us precisely what they need. Scientists are changing the game with artificial intelligence (AI), fueling research to better understand other living species, from animals to trees. The goal? A deeper connection and a more informed, two-way stream of communication to help us protect them.
Wait…What the Tech?
The machine learning we’re talking about here is foundationally the same as the algorithms that detect our human voices and power voice-assisted technology from our friends Siri and Alexa. It’s also the same concept behind speech-to-text translation: software converts speech into text so digital tools can translate between human languages.
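To make the idea concrete, here is a minimal, purely illustrative sketch in Python (not from any of the projects mentioned in this article): it treats speech recognition as a classification problem, mapping made-up "audio feature" vectors to words with a simple model, then passing the recognized text through a tiny lookup-table "translator." The feature values, vocabulary, and translation table are all invented for the example.

```python
# Illustrative only: a toy "speech-to-text + translate" pipeline.
# Real systems use deep neural networks trained on huge audio corpora;
# here, synthetic feature vectors and a tiny dictionary stand in for that.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Pretend each spoken word is summarized by 3 acoustic features
# (e.g., average pitch, duration, loudness) -- values are invented.
X_train = np.array([
    [220.0, 0.4, 60.0],   # "hello"
    [180.0, 0.6, 55.0],   # "dog"
    [260.0, 0.3, 65.0],   # "walk"
])
y_train = ["hello", "dog", "walk"]

recognizer = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# A tiny English -> Spanish lookup table standing in for a translation model.
translations = {"hello": "hola", "dog": "perro", "walk": "paseo"}

def speech_to_translated_text(audio_features):
    """Step 1: 'recognize' the word; step 2: translate the text."""
    word = recognizer.predict([audio_features])[0]
    return word, translations.get(word, "?")

print(speech_to_translated_text([225.0, 0.42, 61.0]))  # -> ('hello', 'hola')
```

The point of the sketch is only the two-step shape of the pipeline: recognize speech as text first, then translate the text.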
Weight of Words
Beyond what animals say to each other, this area of study offers growing insight into how animals feel. Looked at through this lens, factors like tone, sound length, and volume can provide meaningful insight into animals’ emotions. The tech is already here: take a look at this AI-enabled dog collar, which analyzes five emotional states in a dog (happy, anxious, angry, sad, or relaxed) and tracks its physical activity, helping owners better care for their dog throughout its life.
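The collar’s actual method isn’t described in this article, but a hedged sketch of how such a system might work is straightforward: train a classifier on acoustic features of vocalizations (pitch, duration, loudness) labeled with the five states above. Everything in the example, including the feature values, is synthetic and for illustration only.

```python
# Illustrative sketch: classifying a dog's vocalizations into five emotional
# states from simple acoustic features. The data below is synthetic; a real
# product would train on many labeled recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

STATES = ["happy", "anxious", "angry", "sad", "relaxed"]
rng = np.random.default_rng(0)

def synthetic_bark(state):
    """Generate fake (pitch_hz, duration_s, loudness_db) for a given state."""
    base = {
        "happy":   [500.0, 0.3, 70.0],
        "anxious": [700.0, 0.2, 75.0],
        "angry":   [300.0, 0.5, 85.0],
        "sad":     [400.0, 1.0, 55.0],
        "relaxed": [250.0, 0.8, 50.0],
    }[state]
    return base + rng.normal(scale=[30.0, 0.05, 3.0])

X, y = [], []
for state in STATES:
    for _ in range(40):
        X.append(synthetic_bark(state))
        y.append(state)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Classify a new (synthetic) vocalization.
print(model.predict([[480.0, 0.32, 69.0]]))  # likely -> ['happy']
```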
Mirrors to the Mainstream
Researchers are finding more and more similarities between how animals communicate and how we communicate. Dr. Alison Barker, a neuroscientist at the Max Planck Institute for Brain Research in Germany, used machine-learning algorithms to analyze 36,000 soft chirps recorded in seven mole rat colonies. Her team found that not only did each mole rat have its own unique vocal signature, but each colony had its own distinct dialect; just as in the human world, these dialects are passed down over generations. This, among other insights from the study, was summarized in a recent article by The New York Times.
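The article doesn’t publish the team’s code, but the underlying logic can be sketched: if a model trained only on acoustic features of chirps can reliably tell which colony a chirp came from, that separability is evidence of colony-specific “dialects.” Everything below (the feature names, the numbers of chirps, the simulated colony “accents”) is invented purely to illustrate the idea.

```python
# Illustrative sketch: testing for colony-specific "dialects" by checking
# whether chirps can be attributed to their colony from acoustic features alone.
# All data here is synthetic; the real study analyzed ~36,000 recorded chirps.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
N_COLONIES = 7
CHIRPS_PER_COLONY = 200

X, y = [], []
for colony in range(N_COLONIES):
    # Give each colony its own "accent": a slightly shifted distribution over
    # (peak frequency, chirp length, frequency sweep rate).
    center = rng.uniform([2000, 0.05, 10], [4000, 0.20, 50])
    chirps = center + rng.normal(scale=[150, 0.01, 3], size=(CHIRPS_PER_COLONY, 3))
    X.append(chirps)
    y.extend([colony] * CHIRPS_PER_COLONY)

X = np.vstack(X)
y = np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# High held-out accuracy would suggest each colony has a distinguishable dialect.
print("colony-identification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```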
Disney Dives In
Disney movies have long had a finger on the pulse of new concepts, often introducing them quietly ahead of their time. In the Pixar movie Up, a cartoon dog wears a collar that translates his barks and whines into fluent human speech. For many of us, other shows and films, such as Sabrina the Teenage Witch (hi, Salem!) and Free Willy, come to mind.
A leader in this area of discovery is the Earth Species Project (ESP), a non-profit dedicated to using artificial intelligence to decode non-human communication. The organization is not just researching and gathering information but also building tools, earning it the nickname “the Google Translate for Animals.” You can read about their current projects here.
While getting to know our pets better certainly sounds exciting, the sky is the limit for how this knowledge can help us better care for all animal species and our planet. That’s, of course, if we are ready to listen to what they have to say.