Artificial Intelligence (AI) can track the health of coral reefs by learning the “song of the reef”, new research shows.
In the new study, University of Exeter scientists trained a computer algorithm using multiple recordings of healthy and degraded reefs, allowing the machine to learn the difference.
The computer then analyzed a host of new recordings, and successfully identified reef health 92% of the time.
The team used this to track the progress of reef restoration projects.
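The study itself pairs ecoacoustic indices with machine learning; the exact features and model are not described in this article, so the following is only a minimal illustrative sketch of the general idea: summarize each recording as a vector of acoustic features, learn the typical profile of healthy and degraded reefs from labeled examples, then label new recordings by which profile they sit closer to. The feature names, values, and the nearest-centroid classifier here are all hypothetical stand-ins, not the authors' method.

```python
import math

# Toy feature vectors per recording: [acoustic complexity, snap rate,
# low-frequency energy]. Values are invented for illustration.
healthy_recordings = [[0.82, 0.90, 0.30], [0.78, 0.85, 0.35], [0.80, 0.88, 0.28]]
degraded_recordings = [[0.40, 0.20, 0.60], [0.35, 0.25, 0.65], [0.42, 0.18, 0.55]]

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(healthy, degraded):
    """'Learn' each class as the centroid of its training recordings."""
    return {"healthy": centroid(healthy), "degraded": centroid(degraded)}

def classify(model, features):
    """Label a new recording by its nearest class centroid."""
    return min(model, key=lambda label: distance(model[label], features))

model = train(healthy_recordings, degraded_recordings)
print(classify(model, [0.79, 0.87, 0.31]))  # a new, unlabeled recording
```

In practice the published work uses a suite of ecoacoustic indices computed from hydrophone audio and a trained machine-learning model rather than this toy distance rule, but the train-on-labeled-reefs, classify-new-recordings workflow is the same shape.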
“Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital,” said lead author Ben Williams.
“One major difficulty is that visual and acoustic surveys of reefs usually rely on labor-intensive methods.
“Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings.
“Our approach to that problem was to use machine learning, to see whether a computer could learn the song of the reef.
“Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”
The fish and other creatures living on coral reefs make a vast range of sounds.
The meaning of many of these calls remains unknown, but the new AI method can distinguish between the overall sounds of healthy and unhealthy reefs.
The recordings used in the study were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.
Co-author Tim Lamont, from Lancaster University, said the AI method creates major opportunities to improve coral reef monitoring.
“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” Lamont said.
“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it—especially in remote locations.”
The paper is published in the journal Ecological Indicators.
Ben Williams et al, Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning, Ecological Indicators (2022). DOI: 10.1016/j.ecolind.2022.108986
University of Exeter
Citation: AI learns coral reef ‘song’ (2022, May 27) retrieved 27 May 2022 from https://phys.org/news/2022-05-ai-coral-reef-song.html