
Latest news with #ProjectCETI

What are animals saying? 3 AI tools that could soon tell us their thoughts

Time of India

6 days ago



Efforts to decode animal communication using artificial intelligence are gaining momentum, with researchers worldwide working on projects that could one day allow humans to communicate directly with other species. From dolphins and whales to elephants and parrots, scientists are using advanced AI tools to uncover the complex ways animals convey meaning through sound.

Decoding Dolphin Language with AI

At the forefront of this research is Google DeepMind's DolphinGemma, a large language model trained on decades of dolphin audio. Developed in collaboration with Georgia Tech and the Wild Dolphin Project, the tool is designed to break down dolphin vocalizations, segment the sounds, and process them much as human languages are analyzed. According to Drew Purves, who leads nature-related AI projects at DeepMind, this approach allows scientists to examine dolphin communication at unprecedented scale and depth. The goal is not only to understand how dolphins talk to each other but also to recreate similar sounds and communicate back. The idea of interspecies conversation, once a far-off concept, is now being explored with tangible results.

Earth Species Project: Beyond Dolphins

Another major initiative is the Earth Species Project, a nonprofit founded in 2017 that aims to decode the communication systems of non-human species using AI. Its flagship model, NatureLM-audio, is described as the first large-scale audio-language model built specifically for animal sounds. Through this work, researchers have uncovered surprising findings, such as evidence that some animals, including elephants and parrots, appear to have individual names for one another. Co-founder Katie Zacarian emphasized that the objective is not domination or control but a shift in how humans relate to the natural world: instead of exploiting or subduing nature, the goal is to foster understanding and coexistence across species.

Project CETI and the Whale Language Challenge

Meanwhile, Project CETI (Cetacean Translation Initiative) is focused on the vocal patterns of sperm whales. These animals use 'codas'—brief, rapid clicks—in structured sequences, somewhat like syntax in human language. Using AI to interpret these codas, researchers have found signs of turn-taking in conversations and potentially even distinct dialects, and CETI has isolated specific sounds that may act as punctuation marks in whale speech. The team hopes to have a rudimentary understanding of whale communication by 2026. This work draws parallels to the search for extraterrestrial intelligence, as both fields involve decoding unknown languages. In fact, SETI scientists were part of a team that recorded an acoustic exchange with a humpback whale named Twain, involving back-and-forth calls over a 20-minute period.
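To make the idea of analysing structured click sequences more concrete, here is a minimal Python sketch of one way codas could be represented by their rhythm and grouped into recurring types. It is illustrative only and not Project CETI's actual method; the click timestamps, feature length, and cluster count are invented placeholders.

```python
# Illustrative sketch only: represent each coda as an inter-click-interval vector
# and group codas with similar rhythm. Not Project CETI's actual pipeline; the
# click times and cluster count below are made-up placeholders.
import numpy as np
from sklearn.cluster import KMeans

def coda_features(click_times, n_intervals=4):
    """Turn a list of click timestamps (seconds) into a fixed-length
    inter-click-interval vector, padded/truncated to n_intervals."""
    intervals = np.diff(np.sort(click_times))
    padded = np.zeros(n_intervals)
    padded[: min(len(intervals), n_intervals)] = intervals[:n_intervals]
    return padded

# Hypothetical recordings: each coda is a short burst of click timestamps.
codas = [
    [0.00, 0.15, 0.30, 0.45, 0.60],   # evenly spaced clicks
    [0.00, 0.14, 0.29, 0.46, 0.61],
    [0.00, 0.05, 0.10, 0.40, 0.80],   # a differently patterned coda
    [0.00, 0.06, 0.11, 0.41, 0.79],
]
X = np.stack([coda_features(c) for c in codas])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # codas with similar rhythm land in the same cluster
```

Real analyses work with far larger annotated datasets and richer features such as tempo, rubato, and ornamentation, but the core step of turning click timing into comparable vectors is similar in spirit.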
Limits and Implications of Interspecies Communication

While AI has opened new doors, the limits of language go beyond sound. Many species use a combination of visual, chemical, and mechanical signals that humans do not perceive in the same way; for animals like dolphins, which rely on echolocation, sound is also a kind of visual experience. The Baltic German biologist Jakob von Uexküll's concept of umwelt—an animal's unique perceptual world—illustrates how challenging true translation might be. This raises philosophical questions: if we could talk to animals, would they still be the same creatures? As the writer Stephen Budiansky once noted, understanding a lion through language might strip away what makes it a lion.

Listening to the Living World

Even without perfect translation, animals are already communicating their experiences, especially the impacts of human activity. Healthy ecosystems are full of natural sound, while damaged ones fall silent. Noise pollution, largely from shipping and underwater mining, has steadily increased since the 1960s. Humpback whales, for instance, often stop singing when near commercial vessels, losing a vital tool for migration and mating. Their songs, which evolve over time and span oceans, demonstrate a different understanding of space and time. Speaking whale, then, may not just be about words—it could reshape how we think about our environment and ourselves.

The promise of AI-facilitated interspecies communication is not merely a scientific curiosity. It could redefine humanity's place in the natural world, much like the realization that Earth is not the center of the universe. Whether through dolphins, whales, or parrots, these emerging tools may one day allow us to listen—and respond—in ways we never thought possible.

AI Is Deciphering Animal Speech. Should We Try to Talk Back?

Gizmodo

17-05-2025



Chirps, trills, growls, howls, squawks. Animals converse in all kinds of ways, yet humankind has only scratched the surface of how they communicate with each other and the rest of the living world. Our species has trained some animals—and if you ask cats, animals have trained us, too—but we've yet to truly crack the code on interspecies communication.

Increasingly, animal researchers are deploying artificial intelligence to accelerate our investigations of animal communication—both within species and between branches on the tree of life. As scientists chip away at the complex communication systems of animals, they move closer to understanding what creatures are saying—and maybe even how to talk back. But as we try to bridge the linguistic gap between humans and animals, some experts are raising valid concerns about whether such capabilities are appropriate—or whether we should even attempt to communicate with animals at all.

Using AI to untangle animal language

Towards the front of the pack—or should I say pod?—is Project CETI, which has used machine learning to analyze more than 8,000 sperm whale 'codas'—structured click patterns recorded by the Dominica Sperm Whale Project. Researchers uncovered contextual and combinatorial structure in the whales' clicks, naming features like 'rubato' and 'ornamentation' to describe how whales subtly adjust their vocalizations during conversation. These patterns helped the team create a kind of phonetic alphabet for the animals—an expressive, structured system that may not be language as we know it but reveals a level of complexity researchers weren't previously aware of. Project CETI is also working on ethical guidelines for the technology, a critical goal given the risks of using AI to 'talk' to the animals.

Meanwhile, Google and the Wild Dolphin Project recently introduced DolphinGemma, a large language model (LLM) trained on 40 years of dolphin vocalizations. Just as ChatGPT is an LLM for human inputs—taking in text and images and producing responses to relevant queries—DolphinGemma takes in dolphin sound data and predicts what vocalization comes next. DolphinGemma can even generate dolphin-like audio, and the researchers' prototype two-way system, Cetacean Hearing Augmentation Telemetry (fittingly, CHAT), uses a smartphone-based interface that dolphins employ to request items like scarves or seagrass—potentially laying the groundwork for future interspecies dialogue.

'DolphinGemma is being used in the field this season to improve our real-time sound recognition in the CHAT system,' said Denise Herzing, founder and director of the Wild Dolphin Project, which spearheaded the development of DolphinGemma in collaboration with researchers at Google DeepMind, in an email to Gizmodo. 'This fall we will spend time ingesting known dolphin vocalizations and let Gemma show us any repeatable patterns they find,' such as vocalizations used in courtship and mother-calf discipline. In this way, Herzing added, the AI applications are two-fold: researchers can use it both to explore dolphins' natural sounds and to better understand the animals' responses to human mimicking of dolphin sounds, which are synthetically produced by the AI CHAT system.
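To illustrate the "predict what vocalization comes next" idea in code, here is a toy Python (PyTorch) sketch of a next-token model over discretized audio units. It is only a conceptual stand-in: the vocabulary size, architecture, and random stand-in data are assumptions for illustration, not DolphinGemma's actual design.

```python
# Toy sketch of the next-unit-prediction objective behind audio language models.
# Vocabulary size, architecture, and the random stand-in data are hypothetical.
import torch
import torch.nn as nn

VOCAB_SIZE = 512   # assumed number of discrete "sound units" after tokenizing audio

class ToyVocalizationLM(nn.Module):
    def __init__(self, vocab=VOCAB_SIZE, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, tokens):           # tokens: (batch, seq_len) of unit IDs
        h, _ = self.rnn(self.embed(tokens))
        return self.head(h)              # logits for the next unit at each step

model = ToyVocalizationLM()
fake_units = torch.randint(0, VOCAB_SIZE, (4, 32))   # stand-in for tokenized clicks/whistles
logits = model(fake_units[:, :-1])
loss = nn.functional.cross_entropy(                  # standard next-token objective
    logits.reshape(-1, VOCAB_SIZE), fake_units[:, 1:].reshape(-1)
)
loss.backward()
```

In a real system, the unit IDs would come from an audio tokenizer trained on recorded vocalizations rather than random integers, and generation would sample from the model's predicted distribution to produce dolphin-like sequences.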
Expanding the animal AI toolkit

Outside the ocean, researchers are finding that human speech models can be repurposed to decode terrestrial animal signals, too. A University of Michigan-led team used Wav2Vec2—a speech recognition model trained on human voices—to identify dogs' emotions, genders, breeds, and even individual identities from their barks. The pre-trained human model outperformed a version trained solely on dog data, suggesting that human language model architectures could be surprisingly effective at decoding animal communication.

Of course, we need to consider the different levels of sophistication these AI models are targeting. Determining whether a dog's bark is aggressive or playful, or whether the dog is male or female, is understandably easier for a model than, say, parsing the nuanced meaning encoded in sperm whale phonetics. Nevertheless, each study inches scientists closer to understanding how AI tools, as they currently exist, can best be applied to such an expansive field—and gives the AI a chance to train itself to become a more useful part of the researcher's toolkit.
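As a rough sketch of that transfer-learning recipe, the snippet below fine-tunes a classification head on top of a human-speech Wav2Vec2 checkpoint using the Hugging Face transformers library. The checkpoint name, label set, and stand-in audio are illustrative assumptions, not the Michigan team's actual data or setup.

```python
# Minimal sketch of the transfer-learning idea: start from a speech model
# pre-trained on human audio and fine-tune a classification head on labeled
# animal sounds. Checkpoint, labels, and data below are placeholders.
import torch
from transformers import AutoFeatureExtractor, Wav2Vec2ForSequenceClassification

LABELS = ["playful", "aggressive", "fearful"]          # hypothetical bark labels
extractor = AutoFeatureExtractor.from_pretrained("facebook/wav2vec2-base")
model = Wav2Vec2ForSequenceClassification.from_pretrained(
    "facebook/wav2vec2-base", num_labels=len(LABELS)   # new, untrained head
)

waveform = torch.randn(16000)                          # stand-in for 1 s of bark audio at 16 kHz
inputs = extractor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")
labels = torch.tensor([1])                             # e.g. "aggressive"

outputs = model(**inputs, labels=labels)               # returns cross-entropy loss + logits
outputs.loss.backward()                                # one fine-tuning step (optimizer omitted)
```

The design point the study highlights is that the pre-trained acoustic representations, learned entirely from human speech, already encode features useful for animal sounds, so only the small classification head needs to learn the new task from scratch.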
Even cats—often seen as aloof—appear to be more communicative than they let on. In a 2022 study out of Paris Nanterre University, cats showed clear signs of recognizing their owner's voice, and the felines responded more intensely when spoken to directly in 'cat talk.' That suggests cats pay attention not only to what we say but to how we say it—especially when it comes from someone they know.

Earlier this month, a pair of cuttlefish researchers found evidence that the animals have a set of four 'waves,' or physical gestures, that they make to one another, as well as in response to human playback of cuttlefish waves. The group plans to apply an algorithm to categorize the types of waves, automatically track the creatures' movements, and more rapidly understand the contexts in which the animals express themselves.

Private companies such as Google are also getting in on the act. Last week, Baidu, operator of China's largest search engine, filed a patent with the country's IP administration proposing to translate animal (specifically cat) vocalizations into human language. The quick and dirty on the tech: it would take in a trove of data from your kitty, use an AI model to analyze it, determine the animal's emotional state, and output the human-language message your pet was apparently trying to convey.

A universal translator for animals?

Together, these studies represent a major shift in how scientists are approaching animal communication. Rather than starting from scratch, research teams are building on tools and models designed for humans—and making advances that would have taken much longer otherwise. The end goal could (read: could) be a kind of Rosetta Stone for the animal kingdom, powered by AI.

'We've gotten really good at analyzing human language just in the last five years, and we're beginning to perfect this practice of transferring models trained on one dataset and applying them to new data,' said Sara Keen, a behavioral ecologist and electrical engineer at the Earth Species Project, in a video call with Gizmodo.

The Earth Species Project plans to launch its flagship audio-language model for animal sounds, NatureLM, this year, and a demo for NatureLM-audio is already live. With input data from across the tree of life—as well as human speech, environmental sounds, and even music detection—the model aims to become a converter of human speech into animal analogues. The model 'shows promising domain transfer from human speech to animal communication,' the project states, 'supporting our hypothesis that shared representations in AI can help decode animal languages.'

'A big part of our work really is trying to change the way people think about our place in the world,' Keen added. 'We're making cool discoveries about animal communication, but ultimately we're finding that other species are just as complicated and nuanced as we are. And that revelation is pretty exciting.'

The ethical dilemma

Indeed, researchers generally agree on the promise of AI-based tools for improving the collection and interpretation of animal communication data. But some feel there's a breakdown in communication between that scholarly familiarity and the public's perception of how these tools can be applied.

'I think there's currently a lot of misunderstanding in the coverage of this topic—that somehow machine learning can create this contextual knowledge out of nothing. That so long as you have thousands of hours of audio recordings, somehow some magic machine learning black box can squeeze meaning out of that,' said Christian Rutz, an expert in animal behavior and cognition and founding president of the International Bio-Logging Society, in a video call with Gizmodo. 'That's not going to happen.'

'Meaning comes through the contextual annotation, and this is where I think it's really important for this field as a whole, in this period of excitement and enthusiasm, to not forget that this annotation comes from basic behavioral ecology and natural history expertise,' Rutz added. In other words, let's not put the horse before the cart, especially since the cart—in this case—is what's powering the horse.

But with great power… you know the cliché. Essentially, how can humans develop and apply these technologies in a way that is both scientifically illuminating and minimizes harm or disruption to their animal subjects? Experts have put forward ethical standards and guardrails for using the technologies that prioritize the welfare of the creatures as we get closer to—well, wherever the technology is going. As AI advances, conversations about animal rights will have to evolve. In the future, animals could become more active participants in those conversations—a notion legal experts are exploring as a thought exercise, but one that could someday become reality.

'What we desperately need—apart from advancing the machine learning side—is to forge these meaningful collaborations between the machine learning experts and the animal behavior researchers,' Rutz said, 'because it's only when you put the two of us together that you stand a chance.'

There's no shortage of communication data to feed into data-hungry AI models, from pitch-perfect prairie dog squeaks to snails' slimy trails (yes, really). But exactly how we make use of the information we glean from these new approaches requires thorough consideration of the ethics involved in 'speaking' with animals. A recent paper on the ethical concerns of using AI to communicate with whales outlined six major problem areas: privacy rights, cultural and emotional harm to whales, anthropomorphism, technological solutionism (an overreliance on technology to fix problems), gender bias, and limited effectiveness for actual whale conservation. That last issue is especially urgent, given how many whale populations are already under serious threat.
It increasingly appears that we're on the brink of learning much more about how animals interact with one another—indeed, pulling back the curtain on their communication could also yield insights into how they learn, socialize, and act within their environments. But there are still significant challenges to overcome, not least deciding how we should use the powerful technologies currently in development.

Baidu in China working on AI that will let humans understand animals

India Today

09-05-2025



Have you ever wondered what your cat or dog is trying to say? Sounds ambitious, right? Well, Baidu is now working on something that might help. For those who don't know, Baidu is a Chinese tech company founded in 2000, best known for running the country's biggest search engine. It wants to use AI to understand what animals are feeling or trying to say.

The company recently filed a patent with the China National Intellectual Property Administration describing a system that could turn animal sounds into human language. The system would use a mix of animal sounds, behaviour and body signals to infer the animal's emotions and then turn those emotions into words we can understand.

According to Baidu's patent, the system will first collect sounds made by animals, such as meows, barks or other vocalisations. It will also look at their behaviour, like how they move or act, along with body data such as heart rate. All of this information will be processed together using AI to figure out what the animal might be feeling, such as happiness, fear or hunger. Then, the system would, in theory, match these feelings with words or phrases in human language. This could allow people to talk with their pets in a whole new way.

Baidu said in the patent that the system would allow 'deeper emotional communication and understanding between animals and humans, improving the accuracy and efficiency of cross-species communication.' When asked when this product might be ready, a Baidu spokesperson said, 'There has been a lot of interest in the filing of our patent application. Currently, it is still in the research phase.'

Baidu is not the only one working on this idea. Around the world, other scientists are also trying to use AI to study how animals communicate. For example, Project CETI (Cetacean Translation Initiative) is studying how sperm whales talk to each other using sounds. Another group, the Earth Species Project, is also working to decode animal communication using technology; that project is backed by big names, including LinkedIn co-founder Reid Hoffman.

After news of Baidu's animal translation patent came out, many people on Chinese social media started talking about it. Some were excited, while others weren't so sure about the whole idea. One Weibo user wrote, 'While it sounds impressive, we'll need to see how it performs in real-world applications.'
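The patent, as described here, amounts to a pipeline: fuse vocal, behavioural and physiological signals, infer an emotional state, then map that state to a human phrase. The Python sketch below is a purely hypothetical mock-up of that flow; the features, thresholds and phrase mapping are invented, and Baidu's actual system would presumably use trained models rather than hand-written rules.

```python
# Hypothetical mock-up of the patent's described flow: merge multi-signal pet
# data, infer an emotional state, map it to a phrase. Not Baidu's implementation;
# all features, thresholds, and phrases are invented for illustration.
from dataclasses import dataclass

@dataclass
class PetReading:
    vocal_pitch_hz: float      # e.g. extracted from a meow recording
    activity_level: float      # 0..1, from movement tracking
    heart_rate_bpm: float      # from a wearable sensor

EMOTION_TO_PHRASE = {          # assumed semantic mapping
    "hungry": "I want food.",
    "anxious": "Something is stressing me out.",
    "content": "I'm relaxed and happy.",
}

def infer_emotion(r: PetReading) -> str:
    # Stand-in for the AI analysis step: simple thresholds instead of a model.
    if r.heart_rate_bpm > 180 and r.activity_level > 0.7:
        return "anxious"
    if r.vocal_pitch_hz > 600 and r.activity_level > 0.4:
        return "hungry"
    return "content"

def translate(r: PetReading) -> str:
    return EMOTION_TO_PHRASE[infer_emotion(r)]

print(translate(PetReading(vocal_pitch_hz=650, activity_level=0.5, heart_rate_bpm=140)))
# -> "I want food."
```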

China's Baidu looks to patent AI system to decipher animal sounds

The Hindu

08-05-2025



Ever wished you could understand what your cat is trying to tell you? A Chinese tech company is exploring whether it's possible to translate those mysterious meows into human language using artificial intelligence.

Baidu, owner of China's largest search engine, has filed a patent with China National Intellectual Property Administration proposing a system to convert animal vocalisations into human language, according to a patent document published this week. Scientists have long attempted to decode animal communication, and Baidu's patent represents the latest effort to leverage AI to do so.

The document says the system will collect animal data, including vocal sounds, behavioural patterns, and physiological signals, which will be preprocessed and merged before an AI-powered analysis designed to recognise the animal's emotional state. The emotional states would then be mapped to semantic meanings and translated into human language.

The system could allow "deeper emotional communication and understanding between animals and humans, improving the accuracy and efficiency of cross-species communication," Baidu said in the patent document. "There has been a lot of interest in the filing of our patent application," a Baidu spokesperson said when asked how soon the company could turn the patent into a product. "Currently, it is still in the research phase."

Baidu was among the first major Chinese companies to invest heavily in AI following the 2022 debut of OpenAI's ChatGPT. It unveiled its latest AI model, Ernie 4.5 Turbo, last month, saying it matched the industry's best in several benchmark tests. However, the Ernie chatbot has struggled to gain traction amid fierce competition.

A number of efforts are underway outside China to try and interpret what animals want to convey. International researchers at Project CETI (Cetacean Translation Initiative) have been using statistical analysis and AI since 2020 to understand how sperm whales communicate, while the Earth Species Project, a non-profit founded in 2017 whose backers include LinkedIn's Reid Hoffman, is also trying to use AI to decode animal communication.

Local media reports about Baidu's patent application sparked discussion on Chinese social media platforms late on Wednesday. While some were excited about the possibility of eventually being better able to understand their pets, others were sceptical. "While it sounds impressive, we'll need to see how it performs in real-world applications," commented a user on Weibo.
