By Tom Gerken, Technology reporter
Researchers from the University of Michigan are using artificial intelligence (AI) to better understand what a dog’s bark conveys about whether it is feeling playful or angry.
They are also digging into whether AI could correctly identify a dog’s age, gender and breed based on what it woofs.
The scientists were able to make progress towards decoding canine communication by repurposing existing computer models trained on human speech.
“Advances in AI can be used to revolutionize our understanding of animal communication,” said University of Michigan AI Laboratory head Rada Mihalcea.
“Our research opens a new window into how we can leverage what we built so far in speech processing to start understanding the nuances of dog barks.”
AI has enabled great strides to be made in understanding the subtleties of speech.
AI-powered systems are used to distinguish nuances in tone, pitch and accent, which in turn enables technologies such as voice-recognition software.
They have reached that level of sophistication by being trained on a huge number of real human voices.
However, no comparable database exists for dogs.
“Animal vocalizations are logistically much harder to solicit and record,” pointed out Artem Abzaliev, the study’s lead author.
His team set out to discover whether scientists could get round that lack of data by piggy-backing on research carried out on humans.
So his team gathered the barks, growls and whimpers of 74 dogs of varying breeds, ages and sexes, in a variety of contexts.
They then fed them into a machine-learning model – a type of algorithm that identifies patterns in large data sets – which had been designed to analyse human speech.
And they found it did a good job of cocking an ear at what dogs were communicating too.
On average, the researchers found their model was 70% accurate across various tests.
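The article does not name the model the team repurposed, but the general approach it describes – take the representations a speech-trained model produces for an audio clip, then train a simple classifier on top of them – can be sketched as follows. This is a minimal illustration, not the researchers' actual pipeline: the embeddings here are simulated with random clusters standing in for what a pretrained speech model would output for bark clips, and the two context labels ("play" vs. "aggression") are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for embeddings a pretrained human-speech model would produce
# for each bark clip. Real embeddings would come from the model's hidden
# layers; here we simulate two bark contexts as noisy clusters in a
# 16-dimensional embedding space.
def embed(label, n):
    centre = np.ones(16) if label == "play" else -np.ones(16)
    return centre + rng.normal(scale=2.0, size=(n, 16))

X_train = np.vstack([embed("play", 40), embed("aggression", 40)])
y_train = np.array(["play"] * 40 + ["aggression"] * 40)

# Nearest-centroid classifier: predict whichever context's mean
# embedding lies closest to the clip's embedding.
centroids = {c: X_train[y_train == c].mean(axis=0)
             for c in ("play", "aggression")}

def predict(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

X_test = np.vstack([embed("play", 10), embed("aggression", 10)])
y_test = ["play"] * 10 + ["aggression"] * 10
accuracy = np.mean([predict(x) == y for x, y in zip(X_test, y_test)])
```

The point of the sketch is the division of labour: the heavy lifting (turning raw audio into informative vectors) is done by a model trained on abundant human-voice data, so only a lightweight classifier needs to be fitted on the scarce dog-bark data.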
“This is the first time that techniques optimized for human speech have been built upon to help with the decoding of animal communication,” said Ms Mihalcea.
“Our results show that the sounds and patterns derived from human speech can serve as a foundation for analyzing and understanding the acoustic patterns of other sounds, such as animal vocalizations.”
The researchers say their findings could have “important implications” for animal welfare.
They suggest better understanding the nuances of the various noises animals make could improve how humans interpret and respond to their emotional and physical needs.
The results were presented at the Joint International Conference on Computational Linguistics, Language Resources and Evaluation.
Mexico’s National Institute of Astrophysics, Optics and Electronics also worked with the University of Michigan on the project.