From the infrasonic pulses of the Earth to the ultrasonic waves vibrating through the air – humans are acoustically shut out of a realm of conversations shared between animal species such as whales, elephants, bees and birds.
However, we are well on our way to being able to eavesdrop on or even partake in these conversations.
The Earth Species Project
The Earth Species Project (ESP) is a non-profit organisation dedicated to ethically using artificial intelligence (AI) to decode non-human communication.
“I like to think of AI as similar to the invention of modern optics, which gave us the telescope, and helped us see that Earth was not at the center … pointing this new instrument out at the patterns of the planet will help us to see that humanity is not at the center,”
– Aza Raskin, co-founder of the Earth Species Project.
Inspired and informed by the fields of bioacoustics (the study of sounds made by individual organisms) and ecoacoustics (the study of sounds made by entire ecosystems), ESP consists of an AI research team of around 10 people (as of 2022) with expertise in mathematics, neuroscience and natural language processing, working in about 40 partnerships with biologists and ecologists worldwide to process their collected data.
ESP has formed three key partnerships with leading behavioural ecologists, evolutionary biologists and bioacousticians with whom they collaborate on their research experiments.
One partnership is with Dr Christian Rutz, Professor of Biology at the University of St Andrews, who is collaborating with ESP to analyse the vocal repertoires of crows – among them the Hawaiian crow, known for its tool-using ability and its endangered conservation status – in order to plan effective conservation and reintroduction strategies.
Further, last year ESP published the first publicly available BEnchmark of ANimal Sounds (BEANS), which uses 10 datasets of bioacoustic recordings from a range of animal species (including domesticated ones), thereby setting a baseline for machine-learning classification and detection performance and facilitating collaboration with ecologists and researchers committed to conservation efforts.
How Does it Work?
The fields of bioacoustics and ecoacoustics in their digitised forms collect the communicative sounds of species through digital recorders that are like tiny microphones, placed on the backs of animals such as whales and birds, or installed in their habitats and natural environments.
The devices can record continuously, undisturbed by human presence, even in remote places unreachable to humans, such as the deep sea and mountain peaks.
🤖 Portable sensors and artificial intelligence are helping researchers decode animal communication#AI #artificialintelligence #stemnews #sciencenews #animals #technology https://t.co/0mHfkLMzjy pic.twitter.com/pSRrVpPYDf
— Hannah Mason (@HannahSearchby1) March 15, 2023
These recordings create a “data deluge,” and this is where the AI comes in: it analyses patterns in the recordings via natural-language-processing algorithms that work similarly to Google Translate.
2013 saw the breakthrough of a new algorithm that created a multi-dimensional geometric representation of an entire language from the semantic relationships between its words, “like a galaxy where each star is a word and the distance and direction between stars encodes relational meaning.”
In 2017, researchers then developed a technique to achieve translation by aligning these shapes across different languages, work that would form part of the foundation for generative AI.
ESP drew inspiration from these language processing models, to build models that are working towards decoding nonhuman speech.
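The “galaxy alignment” idea can be sketched in a few lines of NumPy. In the toy below, a fake embedding space stands in for “language A”; rotating it plays the role of “language B” (same geometry, different coordinates), and the orthogonal Procrustes solution recovers the mapping between the two. All data here is synthetic and illustrative – this is the general family of alignment technique, not ESP’s actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embedding galaxy" for language A: 50 word vectors in 3 dimensions.
X = rng.normal(size=(50, 3))

# Language B: the same galaxy in different coordinates (a random orthogonal map).
Q_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
Y = X @ Q_true

# Orthogonal Procrustes: the orthogonal W minimising ||X @ W - Y|| is U @ Vt,
# where U and Vt come from the SVD of X.T @ Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# The recovered map carries language A's galaxy onto language B's.
print(np.allclose(X @ W, Y))  # True
```

In real cross-lingual work the two spaces only match approximately, so the alignment minimises the mismatch rather than eliminating it; the toy’s exact recovery is possible only because B was constructed as a rotation of A.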
“You and I could never sing like a whale or buzz like a bee, but computers and biomimetic robots can. Our digital devices have brought us to the brink of a new era in digitally mediated interspecies communication,” writes Dr. Karen Bakker, Professor at the University of British Columbia, in her recently released book: “The Sounds of Life, How Digital Technology is Bringing Us Closer to the Worlds of Animals and Plants.”
Building a Rosetta Stone #AI system to understand/translate animal communication… @Aza Raskin at #FFLDN https://t.co/vZgwzbinx0 pic.twitter.com/o3TnAAiyvE
— Daniel Kraft, MD (@daniel_kraft) June 14, 2018
Whilst ESP’s applied AI techniques haven’t yet reached the sophistication of OpenAI’s ChatGPT in processing human language, computer scientists have succeeded in designing an algorithm that addresses an issue known as the “cocktail party problem.”
Similar to voice-recognition software, the algorithm can isolate one singular voice in a recording of select species (such as bats and dolphins) from a chorus of overlapping speakers that would otherwise create an overwhelming amount of data – a significant step towards enabling zoological translation.
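The model underlying such separation can be illustrated with a toy linear mixture: two synthetic “animal calls” reach two microphones in different proportions, and knowing the mixing lets us invert it to isolate each voice. Real cocktail-party solvers (such as independent component analysis, or the learned models used on bat and dolphin recordings) must estimate the mixing blindly; this sketch, with an assumed known mixing matrix, only shows the underlying idea.

```python
import numpy as np

t = np.linspace(0, 1, 1000)

# Two toy "voices": a low sinusoidal hum and a train of square-wave clicks.
s1 = np.sin(2 * np.pi * 5 * t)
s2 = np.sign(np.sin(2 * np.pi * 3 * t))
S = np.column_stack([s1, s2])          # true sources, shape (1000, 2)

# Each of two microphones hears a different weighted mix of both voices.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])             # hypothetical mixing matrix
X = S @ A.T                            # microphone recordings, shape (1000, 2)

# With the mixing known, "solving the cocktail party" is just inverting it.
S_hat = X @ np.linalg.inv(A).T
print(np.allclose(S_hat, S))  # True
```

The hard part in practice is that the mixing matrix is unknown and the recordings are noisy, which is why blind source separation needed dedicated algorithms rather than a matrix inverse.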
Are There Any Drawbacks?
Dr Bakker highlights the ethical question of boundaries and consent in the case that humans manage to establish reciprocal communication channels between the human and the nonhuman: “Maybe if they could talk to us, they would tell us to go away.”
Raskin also notes the potential risks of interruption, misinterpretation or violation of nonhuman communication and behaviour that he fears “could disrupt a 34-million-year-old culture.”
Indeed, this could throw into question the current definition of personhood, which in international law is reserved for the human species.
On the other hand, the technologies may also be misused to weaponise other species for human purposes: bees, for instance, are already being used to test bio-detectors in anti-narcotics, homeland security and demining operations in the United States.
The ESP is working towards a set of principles to ensure ethical research and to prioritise the conservation and wellbeing of other species.
Robert Seyfarth, Professor Emeritus of Psychology at the University of Pennsylvania, also warns of the limitations of AI in decoding the communication of animals, which, although they live in complex societies, have a smaller repertoire of sounds than humans.
He stresses the importance of context for each individual being, advocating for the increased practice of field work: “You’ve got to go out there and watch the animals.”
AI as Assistant – Not Replacement
Traditional Ecological Knowledge (TEK), a diverse expertise long practised by Indigenous Peoples across the globe, refers to the practices, knowledge systems and beliefs regarding relationships between species.
It rivals Western science in its accuracy, reliability and scope, although it has been historically marginalised and continues to be undervalued.
#IntoTheWeedsDoc Clip: This powerful moment from the film highlights the Traditional Ecological Knowledge (TEK) Elders coming together to discuss the protection of our environment and communities.#PesticideFreeFuture @kathleenogrady @QUOImedia pic.twitter.com/JhBCQsSAw1
— Into the Weeds Doc (@intotheweedsdoc) May 2, 2023
For example, this year a team of Yucuna women in the Colombian Amazon was reported to be documenting the remaining oral knowledge of nine elders on bees, including their traditional classification system for the diverse species, along with the bees’ names, characteristics, behaviours and beehive locations; these bees have been found to be better protected in Indigenous territory than in other regions of Colombia.
Dr Robin Wall Kimmerer, Professor of Environmental and Forest Biology, renowned author of the award-winning books “Braiding Sweetgrass” and “Gathering Moss,” and enrolled member of the Citizen Potawatomi Nation, is well-versed in nonhuman communication as an expert in both Western science and TEK:
“Mosses don’t speak our language, they don’t experience the world the way we do. So in order to learn from them I chose to adopt a different pace, an experiment that would take years, not months. To me, a good experiment is like a good conversation. Each listener creates an opening for the other’s story to be told. So, to learn about how Tetraphis makes reproductive choices, I tried to listen to its story … I was starting to think like a moss. With patient watching, and no direct questions, year by year, Tetraphis began to tell its own story.”
Indeed, human curiosity about the nonhuman perspective has long been part of our culture – from Aesop’s fables to Dug the Dog’s translation collar in Pixar’s “Up.”
We construct entire mythologies in which humans are transformed into piscean, canine and arborescent hybrids, so powerful is our curiosity to know what other species might have to say to us.
It is this innate biophilic empathy we too often devalue that reminds us of our own animality and urges us to take back our natural place in the ecosystem alongside our nonhuman fellows.
A Technological Eden?
If committed to working in an equal and reciprocal dialogue with the biological and ecological experts directly interacting with nonhuman species and communities, organisations such as ESP show promise in grounding technology’s unstable potential in the Earth’s service.
In a strange irony, the world of technology that has led us so far away from the natural world, may just be the key to unlocking our place back inside of it.
Editor’s Note: The opinions expressed here by the authors are their own, not those of Impakter.com — In the Featured Photo: Sheep mirroring one another, mouths open mid-bleat. Featured Photo Credit: suju-foto