The week of July 6th, 2024, I was in Alaska. One day my family and I decided to go see a “lecture” given by a naturalist. Alaska is well known for the many whales that live there, from humpbacks to orcas, and the lecture covered the types of whales we had a chance to see. I was listening when one sentence amazed me: “Humans have been trying to use AI to decode whale language and echolocation signals.” That really stood out to me, so I decided to do some research. This blog will go over the methods being used to decipher animal language, along with related questions such as which animals are being studied and what prompted researchers to attempt to decode animal language in the first place.
Before we get into the actual research and findings, let’s first look at why researchers wanted to decode animal language in the first place. It all started with a gorilla named Koko, who was taught to communicate using sign language. Although this was fascinating, it created controversy over whether Koko actually understood what she was signing. That controversy prompted a new idea: instead of trying to teach animals a language that we humans have created, why don’t we try to understand how animals communicate among themselves? Karen Bakker, a professor at the University of British Columbia, uses the term umwelt in her book The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants, describing it as the “idea of the lived experience of an organism.” Bakker argued that if we paid attention to another organism’s umwelt, we wouldn’t expect that organism to speak and understand our language; instead, we would try to understand its own languages and methods of communication.
The next thing we want to know is how the data is collected and processed. Specialized digital recorders are used to record animal communication. These aren’t ordinary digital recorders; they are scientific recorders that can capture sound anywhere from deep in the ocean to the tops of mountains, and they are sometimes even attached to the animals themselves. This is where artificial intelligence (AI) comes in. The same kind of machine learning that powers translation software such as Google Translate is also used to analyze patterns in animal communication. Researchers haven’t completely deciphered any animal language yet, but these tools are getting them closer and closer to that goal.
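Just to make that idea a bit more concrete, here is a minimal sketch in Python of what “finding patterns in recordings” could look like: load some audio clips, summarize each one as a feature vector, and group similar-sounding clips together. The file names, the libraries (librosa and scikit-learn), and the number of groups are all assumptions I picked for illustration; this is not the actual pipeline any of these research teams use.

```python
import numpy as np
import librosa                      # audio loading and feature extraction
from sklearn.cluster import KMeans  # simple unsupervised grouping

# Hypothetical recordings; real projects use thousands of field recordings.
clips = ["call_01.wav", "call_02.wav", "call_03.wav"]

features = []
for path in clips:
    audio, sr = librosa.load(path, sr=None)      # load the waveform
    mfcc = librosa.feature.mfcc(y=audio, sr=sr)  # summarize the sound's "shape"
    features.append(mfcc.mean(axis=1))           # one feature vector per clip

# Group similar-sounding clips together.
labels = KMeans(n_clusters=2, random_state=0).fit_predict(np.array(features))
print(labels)
```

The interesting part comes after the grouping: researchers check what the animals were doing when the clips in each group were recorded, which is how a cluster of sounds starts to get a meaning attached to it.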
A simple example of this whole process is research done by Yossi Yovel of Tel Aviv University. His team monitored roughly two dozen Egyptian fruit bats for two and a half months, recording both video and audio. From the recordings they could pick out certain sounds and work out what they meant. In one instance, a clip showed two bats fighting over food, and the team noted the specific sound pattern that went with it. As they listened to more clips, they noticed the same or similar patterns in the same situation, which told them that this sound pattern indicates two bats fighting over food. The team also noticed that the bats seemed to have individual “names”: when a particular call was made, one specific bat would respond in a certain way.
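The matching step Yovel’s team describes is basically a classification problem: given a call and a label for the situation it was recorded in (taken from the video), can a model predict the situation for new calls? Here is a rough, hypothetical sketch of that idea in Python; the feature vectors and labels below are invented for illustration and are not the team’s actual data or method.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Invented example data: each row is a feature vector summarizing one call,
# and each label is the situation observed on video when it was recorded.
X = [[0.8, 0.1], [0.7, 0.2], [0.1, 0.9], [0.2, 0.8], [0.75, 0.15], [0.15, 0.85]]
y = ["food squabble", "food squabble", "other", "other", "food squabble", "other"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0
)

# Fit a simple classifier and check whether it generalizes to calls it hasn't seen.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(model.predict(X_test))
print(model.score(X_test, y_test))
```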
Another example is that researchers have found simple ways to communicate with bees using patterns of vibration. They used methods similar to those in the bat studies, but here they looked for distinct vibration patterns in the bees’ communication. They have refined their algorithms to the point where they can track where specific bees are or where they are headed, and they can identify patterns that function as “stop” or “danger” signals. Tim Landgraf, a researcher at Freie Universität Berlin, is trying to create a robot that looks and functions like a bee. His goal is to one day fly the robot into a beehive and, using this research, communicate with the bees and give them different commands.
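As a toy picture of how a detected vibration might be matched against known signals, here is a short Python sketch that compares a new pattern to stored templates using correlation. The “stop” and “danger” templates are completely made up; they only stand in for whatever patterns researchers have actually catalogued, and real systems are far more sophisticated.

```python
import numpy as np

# Made-up vibration templates standing in for catalogued bee signals.
templates = {
    "stop":   np.array([0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]),
    "danger": np.array([0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]),
}

def best_match(signal: np.ndarray) -> str:
    """Return the name of the template most correlated with the signal."""
    scores = {name: float(np.corrcoef(signal, t)[0, 1]) for name, t in templates.items()}
    return max(scores, key=scores.get)

# A hypothetical measured vibration that happens to resemble the "danger" pattern.
measured = np.array([0.1, 0.6, 0.9, 0.4, 0.0, -0.6, -0.9, -0.4])
print(best_match(measured))  # prints "danger"
```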
It is fascinating to me how far we humans have come, to the point where we can actually begin to understand other animals and the ways they communicate. When I first heard, in Alaska, about the efforts being made to decode animal language, I was definitely intrigued, but I didn’t know the lengths researchers have gone to in order to understand these animals. I also didn’t know that the same kinds of algorithms behind Google Translate were processing the data collected by researchers. Overall, my curiosity led me to explore what scientists are doing to process animal language, and that exploration has broadened my view of software like Google Translate and of language as a whole. I feel that if we are already finding ways to talk to bees, who knows what is out there in our future as humans? It seems that science will just have to pave that path.
Sources
Bushwick, Sophie. “How Scientists Are Using AI to Talk to Animals.” Scientific American, February 20, 2024. https://www.scientificamerican.com/article/how-scientists-are-using-ai-to-talk-to-animals/.
Kostov, Y., and V. Alexandrova. “How to Recognize (Decode) Animal Language.” Bulgarian Journal of Agricultural Science 15 (2009). https://www.agrojournal.org/15/05-14-09.pdf.
Newcomb, Tim. “Humans May Be Shockingly Close to Decoding the Language of Animals.” Popular Mechanics, February 21, 2024. https://www.popularmechanics.com/science/animals/a42689511/humans-could-decode-animal-language/.
Image Credit: National Geographic
Main, Douglas. “Why Koko the Gorilla Mattered.” Animals, June 21, 2018. https://www.nationalgeographic.com/animals/article/gorillas-koko-sign-language-culture-animals.