Researchers have unveiled a new AI model that transforms murky, hard-to-see underwater images into clear, highly accurate 3D scenes. The tool could help ecologists observe and understand environments like coral reefs more accurately.
Researchers at the Massachusetts-based Woods Hole Oceanographic Institution (WHOI) and the Massachusetts Institute of Technology (MIT) designed the model, named SeaSplat, to offset water’s two primary distorting effects on photos: haze and discoloration.
SeaSplat reconstructs what an underwater scene would look like if the water and its distortions were removed. It can transform photos that appear colorless or washed out into bright, sharp images that reflect an object or animal’s true colors.
The model can also generate accurate 360-degree reconstructions. A technique called 3D Gaussian splatting helps the model precisely predict and digitally recreate full 3D scenes from two-dimensional underwater photos.
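3D Gaussian splatting represents a scene as a cloud of translucent, colored Gaussian blobs that are projected onto the image and blended front to back. As a toy illustration of that blending step (a 1D sketch with made-up numbers, not SeaSplat’s implementation, which uses anisotropic 3D Gaussians), the core alpha-compositing loop looks like:

```python
import numpy as np

def composite_pixel(x, splats):
    """Blend a pixel's color from Gaussian splats sorted near-to-far.

    splats: list of (center, sigma, opacity, rgb_color) tuples.
    Each splat contributes color weighted by its Gaussian falloff at x,
    scaled by how much light still passes through the splats in front.
    """
    color = np.zeros(3)
    transmittance = 1.0  # fraction of light not yet absorbed
    for center, sigma, opacity, rgb in splats:
        alpha = opacity * np.exp(-0.5 * ((x - center) / sigma) ** 2)
        color += transmittance * alpha * np.asarray(rgb, dtype=float)
        transmittance *= (1.0 - alpha)  # splats behind see less light
    return color

# A fully opaque red splat in front hides a blue splat behind it:
front_red = (0.0, 1.0, 1.0, [1.0, 0.0, 0.0])
back_blue = (0.0, 1.0, 1.0, [0.0, 0.0, 1.0])
print(composite_pixel(0.0, [front_red, back_blue]))  # red wins
```

Because every splat has a 3D position, the same scene can be re-rendered from any virtual camera pose, which is what enables the 360-degree reconstructions.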
Using large numbers of underwater images collected by divers and robots, the researchers tweaked the 3D reconstruction model to automatically correct for water’s inherent distortions and depict underwater objects as they’d appear on dry land. It takes into account the 3D structure of the scene—along with the geometry and relative position of every object—to create true-to-life color.
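Those distortions follow known physics: each color channel is attenuated at a different rate with distance through water, while backscattered light adds a hazy blue-green veil that grows with range. A minimal forward sketch of that effect, with illustrative coefficients rather than SeaSplat’s learned parameters (the general form follows standard underwater image formation models):

```python
import numpy as np

# Illustrative per-channel (R, G, B) water coefficients -- assumed values,
# not SeaSplat's estimated parameters.
BETA_ATTN = np.array([0.6, 0.2, 0.1])      # red light attenuates fastest
BETA_SCATTER = np.array([0.2, 0.25, 0.3])  # backscatter buildup rate
VEIL_COLOR = np.array([0.0, 0.2, 0.35])    # blue-green "water color"

def through_water(true_color, distance):
    """How a surface color appears through `distance` meters of water:
    an attenuated direct signal plus a distance-dependent backscatter veil."""
    direct = np.asarray(true_color, dtype=float) * np.exp(-BETA_ATTN * distance)
    veil = VEIL_COLOR * (1.0 - np.exp(-BETA_SCATTER * distance))
    return direct + veil

# A bright red coral viewed from 5 meters away appears blue-green:
print(through_water([0.9, 0.2, 0.1], 5.0))
```

Because the amount of distortion depends on each object’s distance from the camera, knowing the scene’s 3D structure is exactly what lets the model undo it per pixel.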

Researchers hope SeaSplat will give ecologists better tools to understand coral reefs’ rich biodiversity and to protect these environmentally critical habitats. Reefs are often called the “rainforests of the sea” for the important role they play in nurturing underwater biodiversity.
Over the past two years, about 84% of the ocean’s reefs have experienced harmful bleaching, according to the International Coral Reef Initiative, a global partnership among nations and organizations. Since 1998, there have been four global bleaching events that have devastated reef ecosystems.
“Reefs are a small part of the ocean but are home to a huge amount of biodiversity, so it’s extremely important to monitor coral reef ecosystems,” said Yogesh Girdhar, an associate scientist at WHOI and co-creator of the model. “Scientists can use this model to quantify the biodiversity of reefs and detect specific events like coral bleaching or disease.”
Girdhar and co-author Daniel Yang, an MIT graduate student, presented their work on SeaSplat at a robotics conference in mid-May, just weeks before the June 1 annual World Reef Awareness Day. John Leonard, an MIT professor of mechanical engineering, also co-authored the study.
To capture underwater imagery for training the model, the researchers used a submersible robot outfitted with an NVIDIA Jetson Orin module, whose onboard edge computing helped guide the craft.
The model’s initial training run was powered by NVIDIA L40 GPUs; for inference, it works on images taken by standard underwater cameras, with no specialized equipment or fancy lighting required.
Ecologists upload raw underwater images, and the model creates corrected versions that restore natural colors and sharper detail. For example, SeaSplat can accurately add back reds and yellows, which are usually washed out in underwater photographs, and bring out fine features in corals and marine organisms.
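Conceptually, once the per-channel attenuation and backscatter of the water have been estimated along with each pixel’s distance, the correction inverts that physics: subtract the backscatter veil, then amplify each channel by the attenuation it lost, which is why reds recover the most. A hedged sketch of that inversion, with hypothetical coefficients and function names standing in for quantities a model like SeaSplat estimates from the scene itself:

```python
import numpy as np

# Hypothetical per-channel (R, G, B) water parameters -- stand-ins for
# values estimated from the scene, not SeaSplat's actual code.
BETA_ATTN = np.array([0.6, 0.2, 0.1])      # red attenuates fastest
BETA_SCATTER = np.array([0.2, 0.25, 0.3])  # backscatter buildup rate
VEIL_COLOR = np.array([0.0, 0.2, 0.35])    # blue-green water veil

def restore_color(observed, distance):
    """Invert attenuation and backscatter at a known camera-to-object distance."""
    veil = VEIL_COLOR * (1.0 - np.exp(-BETA_SCATTER * distance))
    direct = np.clip(np.asarray(observed, dtype=float) - veil, 0.0, None)
    return direct * np.exp(BETA_ATTN * distance)  # undo per-channel loss
```

The red channel is multiplied by the largest factor (`exp(0.6 * distance)` here), matching the intuition that reds are the most washed-out colors to recover.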
“Our goal is to help ecologists get higher resolution of the sea floor and to better understand coral reefs,” Girdhar said. “Because we have a very high quality 3D model, we can move the camera around virtually anywhere, and render a reconstructed image from virtually any point of view that is extremely close to what the actual object is.”
So far, SeaSplat has been used to analyze and upgrade underwater reef images taken in the U.S. Virgin Islands, the Red Sea, and Curaçao. Going forward, the researchers want to make the model more generalizable and scalable so that it can be used in virtually any underwater survey or study.
Read additional coverage of SeaSplat and check out the researchers’ paper.