AI Analyzes Dolphin Chatter And Discovers Something We Didn’t Know

In each culture, we’re trained to hear in a certain way. Speakers of tonal languages such as Chinese recognize subtleties of pitch that are difficult for English speakers to grasp. Something similar happens with Indian music, which uses pitches that fall between the notes Western listeners are familiar with. Biases like these make it difficult for human researchers to discern the subtleties of animal sounds that may sound alike to us, but perhaps not to the animals themselves. AI, though, is very good at identifying patterns, and researchers have set it the task of analyzing recordings captured near the ocean floor. It has just found six distinct dolphin click types we never knew existed, according to a study just published in PLOS Computational Biology.

Previous examinations of dolphin sounds have been done by humans, either listening to recordings from boats tracking the cetaceans or captured by hydrophones — underwater microphones — near or on the ocean floor. It’s believed that dolphins use clicks for the echolocation with which they navigate, socialize, and find food.


Scientists have been using the clicks to distinguish between species and types of dolphins, and to track their subsurface perambulations. But they do sound alike to us. As biologist Simone Baumann-Pickering, who wasn’t involved in the new study, tells Science News, “When you have analysts manually going through a dataset, then there’s a lot of bias introduced just from the human perception. Person A may see things differently than person B.”

A team of oceanographers led by Kaitlin E. Frasier of the Scripps Institution of Oceanography in La Jolla, Calif., used an AI algorithm to go through 52 million clicks recorded across the Gulf of Mexico over a period of two years. “We don’t tell it how many click types to find,” Frasier tells Science News. “We just kind of say, ‘What’s in here?’”

The sources of the ocean-floor recordings (Kaitlin E. Frasier, Marie A. Roch, Melissa S. Soldevilla, Sean M. Wiggins, Lance P. Garrison, John A. Hildebrand)

The algorithm sorted the clicks based on a handful of criteria, beginning with variations within a click and then variations in the way multiple clicks were strung together. Clicks were further grouped according to their tonal, or “spectral” shape, by the brief silences within each click, and by the longer ones between clicks.
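To make the idea concrete, here is a minimal sketch of that kind of unsupervised grouping: synthetic click "spectra" are clustered by similarity without telling the algorithm how many click types to expect. Everything here is illustrative, not the study's actual method or data; the peak frequencies, similarity threshold, and greedy-clustering approach are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
bins = np.arange(64)  # a 64-bin power spectrum stands in for a click's spectral shape

def make_clicks(peak, n):
    # Hypothetical click type: a spectral peak at `peak` plus a little noise
    base = np.exp(-0.5 * ((bins - peak) / 4.0) ** 2)
    return base + 0.05 * rng.random((n, 64))

# Three synthetic click types with different spectral peaks, shuffled together
clicks = np.vstack([make_clicks(p, 50) for p in (15, 30, 48)])
rng.shuffle(clicks)

# Normalize each spectrum, then greedily group clicks whose spectra are
# highly similar. The number of groups emerges from the data -- echoing
# "we don't tell it how many click types to find".
clicks /= np.linalg.norm(clicks, axis=1, keepdims=True)

centroids = []  # running sum of spectra for each discovered click type
labels = []
for spec in clicks:
    sims = [float(spec @ c) / np.linalg.norm(c) for c in centroids]
    if sims and max(sims) > 0.95:        # similarity threshold (assumed)
        k = int(np.argmax(sims))
        centroids[k] = centroids[k] + spec   # fold the click into its group
        labels.append(k)
    else:
        centroids.append(spec.copy())        # a new click type is born
        labels.append(len(centroids) - 1)

print("click types discovered:", len(centroids))
```

The real study's pipeline also used inter-click interval patterns and far more sophisticated clustering, but the core property is the same: the categories come out of the data rather than being specified in advance.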


Continue reading the full article at Big Think.
