Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea
Completed
Primary Goal
Develop technology for underwater vehicles to find, follow, and identify seafloor and water column animals in underwater video in real time
Dates
October 22-23, 2024
January 22, 2025
July 7-11, 2025
Location
Pacific Ocean: Monterey Bay, California
Vessel
Research Vessel Rachel Carson (October 2024 and January 2025); Research Vessel Paragon (July 2025)
Primary Technology
Artificial intelligence, remotely operated vehicle MiniROV, long range autonomous underwater vehicle Triton, Triton imaging system
Overview
For the past two years, engineers and scientists have been developing a way for artificial intelligence (AI)-driven underwater vehicles to autonomously find, follow, and identify deep-sea animals in real time, with limited human oversight. Their Deployable AI will help scientists, policymakers, and the public better and more rapidly understand the life that inhabits our ocean.
Screenshot from the FathomNet Database, a publicly available underwater image database used to train the Deployable AI developed to find, follow, and identify deep-sea animals as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. The jellyfish seen here was observed during an expedition led by NOAA Ocean Exploration, one of a number of contributors to the database. Image courtesy of MBARI/FathomNet, NOAA Ocean Exploration.
Modern robotics, low-cost observation platforms, and other emerging exploration tools make underwater imaging easier. However, searching for and following animals, and then analyzing all the resulting imagery, takes a lot of effort. This new technology aims to overcome these major obstacles to discovery.
How It Works
The Deployable AI technology consists of hardware (cameras and a compact computer) and software (with several computational algorithms) to enable detection and tracking of underwater animals by a remotely operated vehicle (ROV) or autonomous underwater vehicle (AUV). During a dive, “detector” and “supervisor” algorithms review live video, looking for animals they’ve been trained to recognize (e.g., fish, jellyfish, siphonophores, and comb jellies), similar to facial recognition.
Once they find something of interest, another “agent” algorithm works with the vehicle control algorithm to maneuver the vehicle, following the animal slowly and from far enough away to avoid disturbing it, while continuing to image it as long as possible.
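The detect-and-follow behavior described above can be sketched as a simple control loop. This is a hypothetical illustration only, not the project’s actual code: all names, frame dimensions, gains, and the standoff distance are assumptions, and a real system would add depth control, smoothing, and loss-of-target handling.

```python
# Hypothetical sketch of a detect-and-follow loop: a detector reports a
# target's position in the frame, and a proportional controller steers the
# vehicle to keep the target centered while holding a standoff distance.
from dataclasses import dataclass

FRAME_W, FRAME_H = 1920, 1080  # assumed camera resolution, pixels
STANDOFF_M = 2.0               # follow from far enough away to avoid disturbing the animal
GAIN = 0.05                    # proportional gain (illustrative value)

@dataclass
class Detection:
    cx: float        # target center in the frame, pixels
    cy: float
    range_m: float   # estimated distance to the target, meters

def steering_command(det: Detection) -> tuple[float, float, float]:
    """Return (yaw, pitch, surge) commands that re-center the target."""
    yaw = GAIN * (det.cx - FRAME_W / 2)        # steer right if target is right of center
    pitch = GAIN * (det.cy - FRAME_H / 2)      # pitch down if target is low in the frame
    surge = GAIN * (det.range_m - STANDOFF_M)  # close in, but stop at the standoff distance
    return yaw, pitch, surge

# Example: target slightly right of center, 3 meters away
cmd = steering_command(Detection(cx=1060, cy=540, range_m=3.0))
```

Each nonzero command pushes the vehicle toward a state where the target sits at the frame center at the standoff range, at which point all three commands go to zero and the vehicle simply paces the animal.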
Initial Testing
To ensure the agent could perform these tasks effectively, it underwent additional training in a simulated environment with an ROV and animals, much like playing a video game. After the team was satisfied with the agent’s performance, they integrated it on MBARI’s (Monterey Bay Aquarium Research Institute’s) MiniROV, since ROVs offer greater control for testing purposes, and tested and refined it, teaching it to follow an artificial jellyfish mimic in a 10-meter-deep (33-foot-deep) test tank.
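The idea of training in simulation can be sketched in miniature. The toy environment below is purely illustrative (a one-dimensional drifting target, a hand-written proportional policy): the project’s actual simulator, reward design, and learning algorithm are not described in enough detail here to reproduce.

```python
# Hypothetical sketch of training in a simulated environment: the agent is
# scored on how well it keeps up with a randomly drifting simulated animal,
# episode after episode, much like learning to play a video game.
import random

def run_episode(policy, steps=100):
    """One simulated dive: the target drifts, the agent tries to track it."""
    target, vehicle = 0.0, 0.0       # 1-D positions, for illustration only
    total_reward = 0.0
    for _ in range(steps):
        target += random.uniform(-0.1, 0.1)    # animal drifts randomly
        vehicle += policy(target - vehicle)    # agent moves based on tracking error
        total_reward -= abs(target - vehicle)  # penalty for falling behind the target
    return total_reward

# A trivial proportional "policy" stands in for a trained agent here;
# training would search for a policy that maximizes this episode reward.
random.seed(0)
score = run_episode(lambda err: 0.5 * err)
```

A learning loop would repeatedly run such episodes, adjusting the policy to raise the score, which is what allows the agent to practice safely before ever touching real hardware.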
Remotely operated vehicle MiniROV in a simulated environment during the training of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.
Next Stop, the Ocean
In October 2024, the agent was ready for real-world testing in Monterey Bay. Deployed on MiniROV from Research Vessel Rachel Carson, it performed both tasks relatively well — finding and following siphonophores, comb jellies, and jellyfish — but still needed work.
Screenshots captured in the control room of Research Vessel Rachel Carson during the first field test on remotely operated vehicle MiniROV of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Images courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.
Back on land, the team continued to improve the agent in preparation for another test in Monterey Bay in January 2025.
A siphonophore, one of the many biological targets followed during the second field test on remotely operated vehicle MiniROV of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. This screenshot was taken from the telepresence feed on Research Vessel Rachel Carson. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.
Pleased with the results of the second test and confident the agent could follow animals on its own, the team turned their attention to developing a specialized stereo imaging system with a compact computer to work with it. This system is unique because it can operate at depths up to 1,500 meters (almost a mile) for weeks at a time, has a wide field of view, uses strobe lighting to conserve power, and can run AI algorithms to automate decision making on AUVs.
In July 2025, they headed back to sea on Research Vessel Paragon to test the new Triton imaging system on long range autonomous underwater vehicle (LRAUV) Triton with a program specifically trained to detect siphonophores. During a single 69-hour deployment, the LRAUV performed a series of pre-programmed dives in a “yo-yo” pattern (descending and ascending between maximum and minimum depths for specified durations), autonomously documenting several siphonophore species. The quality of the imagery enabled species identification for animals at least 1 centimeter (0.4 inches) in size.
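The “yo-yo” pattern described above, cycling between a minimum and maximum depth while transiting, can be expressed as a simple triangular depth profile. The depth limits and cycle time below are illustrative assumptions, not the actual mission parameters.

```python
# Hypothetical sketch of a "yo-yo" dive profile: the vehicle descends from
# a minimum to a maximum depth and back on a fixed cycle, repeatedly
# sampling the water column as it transits.
def yo_yo_depth(t_s, d_min=50.0, d_max=250.0, cycle_s=1200.0):
    """Depth (m) at time t_s for a triangular descend/ascend cycle."""
    half = cycle_s / 2
    phase = t_s % cycle_s
    if phase < half:  # descending leg
        return d_min + (d_max - d_min) * phase / half
    return d_max - (d_max - d_min) * (phase - half) / half  # ascending leg

# Sample one full cycle at five points
depths = [yo_yo_depth(t) for t in (0, 300, 600, 900, 1200)]
# → [50.0, 150.0, 250.0, 150.0, 50.0]
```

Sweeping the full depth band on every cycle is what lets a single long transect image animals distributed throughout the water column rather than at one fixed depth.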
Map showing the location of the deployment of long range autonomous underwater vehicle Triton during the fieldwork in July 2025 for the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. The white line indicates the vehicle’s main transect along Monterey Canyon in Monterey Bay. Map courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.
Long range autonomous underwater vehicle Triton on the deck of Research Vessel Paragon following testing of the Triton imaging system (on the nose cone) developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Korneel Somers/MBARI.
Bringing It All Together
Through their series of tests, the team confirmed that their supervisor, detector, agent, and imaging system work as intended. The next step is to test them together, which will require some changes to the vehicle itself. In the meantime, the team will continue to refine their approach, enabling the vehicle to make decisions about which animals to follow based on specific research interests and exploring potential improvements to the imaging system. They also have approximately 170 hours of video (three cameras, 56 hours each) to analyze and annotate. Annotations will be added to the FathomNet Database, a publicly available database of annotated images that helps anyone from experts to enthusiasts analyze ocean imagery.
A comb jelly (cydippid ctenophore) imaged during testing of the Triton imaging system on a long range autonomous underwater vehicle as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.
This project shows great promise for expanding the capabilities of uncrewed underwater vehicles and advancing the study of deep-sea animals. By automating the collection and real-time analysis of large volumes of underwater imagery and enabling underwater vehicles to autonomously adapt their direction and speed to target and continuously follow animals, we will be better equipped to explore our ocean and discover and understand the life that lives there.
Automated tracking of a jellyfish (Stellamedusa sp.) during the second field test on remotely operated vehicle MiniROV of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. This genus of jellyfish is relatively rare in Monterey Bay. Video courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.