Global warming and pollution are causing severe stress to coral reefs across the world.

Researchers from the University of California, Berkeley, and the University of Queensland developed a deep learning system that automatically analyzes reef photos, helping scientists measure reef health and track changes over time. Reefs provide food and shelter for more than a quarter of all marine species. They also support fish stocks that feed more than a billion people and provide jobs to millions in coastal areas.

The new technology “will allow the world’s scientists to more quickly assess the health of coral reefs at scales never dreamed of before,” said Ove Hoegh-Guldberg, chief scientist of the Global Reef Record and a professor at the University of Queensland. With that information, scientists can take more effective steps to protect and save reefs.

Oscar Beijbom, a postdoctoral scholar at Berkeley, used TITAN X GPUs and the Caffe deep learning framework to train an image recognition system that identifies 40 categories of corals, sponges, algae, and other elements, achieving a 900x speedup over previous analysis methods.
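To make the workflow concrete, here is a minimal sketch of the point-annotation approach that tools like CoralNet use: sample random points in a reef photo, crop a patch around each point, classify each patch into a benthic category, and report percent cover. The `classify_patch` stub stands in for the trained Caffe network described in the article; the category names, patch size, and function names here are illustrative assumptions, not details from the source.

```python
import numpy as np

# Illustrative category list and patch size -- assumptions, not the
# actual 40 categories or network input size used in the study.
CATEGORIES = ["hard coral", "soft coral", "algae", "sponge", "sand"]
PATCH = 224

def sample_points(height, width, n_points, rng):
    """Pick n random (row, col) annotation points inside the image,
    far enough from the border that a full patch fits."""
    half = PATCH // 2
    rows = rng.integers(half, height - half, size=n_points)
    cols = rng.integers(half, width - half, size=n_points)
    return list(zip(rows, cols))

def crop_patch(image, row, col):
    """Crop a PATCH x PATCH window centered on the point."""
    half = PATCH // 2
    return image[row - half:row + half, col - half:col + half]

def classify_patch(patch, rng):
    """Stub for the CNN: returns a random category index. A real
    system would run the patch through the trained network instead."""
    return rng.integers(0, len(CATEGORIES))

def percent_cover(image, n_points=50, seed=0):
    """Estimate percent cover per category from point classifications."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(len(CATEGORIES))
    for row, col in sample_points(*image.shape[:2], n_points, rng):
        patch = crop_patch(image, row, col)
        counts[classify_patch(patch, rng)] += 1
    return {c: 100.0 * n / n_points for c, n in zip(CATEGORIES, counts)}

# Example on a synthetic image stand-in for a reef photo.
image = np.zeros((1024, 1365, 3), dtype=np.uint8)
cover = percent_cover(image)
```

Point sampling is what makes the method fast to scale: ecologists only need labels at a few dozen points per image rather than a full pixel-level segmentation, and the classifier reproduces exactly that annotation style.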

Photo of a coral reef automatically analyzed by deep learning; the program can identify as many as 40 categories of corals, sponges, algae, and other organisms. (Image courtesy of the XL Catlin Global Reef Record)

All of the coral reef images, nearly 225,000 in total, are hosted on the CoralNet web portal. Beijbom will soon add the deep learning system to the platform, giving coral reef ecologists access to automated analysis.