
AI Can Analyze and Deconstruct Animal Behaviors Better than Humans

Machine learning can predict animal behavior better than humans
Feb 21st, 2020 9:58am

Propelled by curiosity and the necessity of survival, our earliest ancestors were captivated by animal behavior — observing various species as they ate, roamed, foraged and reproduced. This deep fascination has continued to modern times, as scientists probe the inner workings of the animal brain, with the intent of translating those lessons to help us better understand how the human brain works, and how intelligence develops.

Not surprisingly, recent advancements in artificial intelligence are assisting experts in the monumental task of not only observing how animals behave, but also in identifying and analyzing the sequence of typical micromovements that make up certain behaviors — many of them occurring too quickly for the human eye to catch. A team of neuroscientists from Germany’s Max Planck Institute of Neurobiology recently published their findings on using AI to help them deconstruct these kinetic sequences in the feeding activity of zebrafish larvae, with the aim of better understanding how the brain generates behavior.

Little Fish, ‘Multidimensional’ Behaviors

As a tiny tropical freshwater species, zebrafish (Danio rerio) are a popular choice for home aquariums, and also make easy test subjects for scientists doing medical research. As detailed in their paper, the neurobiologists focused on creating an unsupervised machine learning algorithm that analyzed and predicted the movements of zebrafish larvae, particularly how they capture prey.

Using high-speed cameras capable of snapping images at intervals of microseconds, the team zeroed in on the movements of the fishes’ eyes, tails and jaws as they swam in a small container. These movements offer clues as to what a fish is up to: for instance, the way a zebrafish’s tail waves changes continuously to adapt to the location of its prey. The algorithm assessed the data from these images, allowing the team to identify and map similar behavioral sequences, which were grouped into seven distinct “modules.”
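To make the idea concrete, here is a minimal sketch of how per-bout kinematic features might be grouped into behavioral modules with an off-the-shelf unsupervised model. This is not the team’s actual pipeline: the feature set, array shapes and the use of scikit-learn’s GaussianMixture are illustrative assumptions; only the count of seven modules comes from the study.

```python
# Illustrative sketch (not the study's actual method): cluster per-bout
# kinematic features -- e.g. tail curvature, eye convergence angle, jaw
# angle -- into a small set of behavioral "modules" using an unsupervised
# Gaussian mixture model. All feature names and shapes are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical dataset: one row per movement bout, one column per
# kinematic feature extracted from the high-speed video.
features = rng.normal(size=(5000, 12))

# Standardize so no single feature dominates the clustering.
X = StandardScaler().fit_transform(features)

# Fit a mixture model with seven components, mirroring the seven
# behavioral modules reported by the researchers.
gmm = GaussianMixture(n_components=7, covariance_type="full", random_state=0)
modules = gmm.fit_predict(X)

# Each bout now carries a module label; reading consecutive labels over
# time yields a behavioral sequence (e.g. orientation -> approach -> capture).
print(np.bincount(modules))
```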

Most notably, the researchers were able to distinguish three major “modules,” or stereotypical motion patterns, as zebrafish attempt to seize prey: orientation, approach and capture. Even when played back in slow motion, the actions making up these discrete behavioral modules happened too quickly for human eyes to discern. But with their algorithm, the team was able to detect that zebrafish larvae must train both eyes on their target before initiating one of three distinct “capture maneuvers,” depending on the distance between fish and prey. If the prey is farther away, the fish will swim quickly toward the victim before gulping it down; if the target is closer, it will instead use strong sucking actions to bring in the kill. Interestingly, the AI allowed the researchers to determine that zebrafish larvae can deploy several different capture strategies by combining various stereotyped tail and jaw movements in a sequential chain.
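As a toy illustration of the distance-dependent choice described above — the threshold value is purely hypothetical, not a figure from the paper — the rule might be sketched as:

```python
# Toy sketch of the distance-dependent maneuver choice described in the
# article; the threshold is a made-up placeholder, not a measured value.
def choose_capture_maneuver(prey_distance_mm: float, threshold_mm: float = 1.0) -> str:
    """Return the capture maneuver the larva is described as favoring."""
    if prey_distance_mm > threshold_mm:
        # Farther prey: dart toward it, then gulp it down.
        return "attack swim"
    # Nearby prey: draw it in with strong sucking actions.
    return "suction capture"
```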

“Artificial intelligence is key in organizing and classifying multidimensional data that our biological brains would be hard-pressed to process,” says Herwig Baier, director of the Max Planck Institute of Neurobiology and one of the co-authors of the study. “This applies equally to the massive loads of data generated by imaging, by gene expression analysis and by behavioral measurements.”

The use of AI has also helped the team examine how the larvae process stimuli during the prey-capture sequence. For instance, the team replaced the prey with a virtual dot that was removed at various times. This prompted the zebrafish to terminate their capture maneuvers no matter when the dot was removed during the sequence, indicating that the target must remain continuously in view for the larvae to continue their hunt.

“The big surprise was that hunting behavior was much less stereotyped than we had thought,” explained Baier. “The zebrafish larvae can abandon the hunt at any point and almost instantly, for example when the prey disappears during the pursuit. It’s as if the animal can change its mind on the fly. This is not what the classical ethologists reported and was only possible to discover with our virtual reality experiment, which gave us full control over the prey stimulus.”

Based on these results, the team now plans to move on to brain imaging and analysis of the neural circuitry, in order to get a better idea of which neurons are active in the zebrafish brain when it initiates its prey-capture sequence. As one might imagine, using AI to analyze zebrafish behavior is but one step toward analyzing human behaviors, and the neural architectures that underpin them, in similarly fine detail.

Read the paper here.

Images: Max Planck Institute of Neurobiology
