A cylindrical orange and yellow AUV on a boat deck with ocean waves and an overcast sky in the background.

Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea

Past Expedition

Primary Goal

Develop technology for underwater vehicles to find, follow, and identify seafloor and water column animals in underwater video in real time

Dates
October 22-23, 2024
January 22, 2025
July 7-11, 2025
Location
Pacific Ocean: Monterey Bay, California
Vessel
Research Vessel Rachel Carson (October 2024, January 2025)
Research Vessel Paragon (July 2025)
Primary Technology
Artificial intelligence, remotely operated vehicle MiniROV, long range autonomous underwater vehicle Triton, Triton imaging system

Overview

For the past two years, engineers and scientists have been developing a way for artificial intelligence (AI)-driven underwater vehicles to autonomously find, follow, and identify deep-sea animals in real time, with limited human oversight. Their Deployable AI will help scientists, policymakers, and the public understand the life that inhabits our ocean more fully and more rapidly.

FathomNet
Screenshot from the FathomNet Database, a publicly available underwater image database used to train the Deployable AI developed to find, follow, and identify deep-sea animals as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. The jellyfish seen here was observed during an expedition led by NOAA Ocean Exploration, one of a number of contributors to the database. Image courtesy of MBARI/FathomNet, NOAA Ocean Exploration.

Modern robotics, low-cost observation platforms, and other emerging exploration tools make underwater imaging easier. However, searching for and following animals, and then analyzing all the resulting imagery, takes a lot of effort. This new technology aims to overcome these major obstacles to discovery.

How It Works

The Deployable AI technology consists of hardware (cameras and a compact computer) and software (several computational algorithms) that enable a remotely operated vehicle (ROV) or autonomous underwater vehicle (AUV) to detect and track underwater animals. During a dive, “detector” and “supervisor” algorithms review live video, looking for animals they’ve been trained to recognize (e.g., fish, jellyfish, siphonophores, and comb jellies), much as facial recognition software looks for faces.
Once they find something of interest, another “agent” algorithm works with the vehicle control algorithm to maneuver the vehicle, following the animal slowly and from far enough away to avoid disturbing it, while continuing to image it for as long as possible.
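The detect-then-follow step can be pictured as a simple control loop: the detector reports where the animal sits in the camera frame, and the agent issues steering commands to keep it centered at a safe standoff distance. The sketch below is a minimal, hypothetical illustration of that idea; the class and function names, the proportional-control scheme, and the use of bounding-box size as a distance proxy are all assumptions for illustration, not MBARI's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical detection report: where the animal's bounding box sits in
# the frame (normalized 0..1) and how much of the frame it fills.
@dataclass
class Detection:
    cx: float    # box center, horizontal (0 = left edge, 1 = right edge)
    cy: float    # box center, vertical (0 = top edge, 1 = bottom edge)
    area: float  # box area as a fraction of the frame

def tracking_command(det: Detection, target_area: float = 0.05, gain: float = 0.5):
    """Return (yaw, pitch, surge) velocity commands, in arbitrary units.

    A proportional controller: steer to center the animal in frame, and
    use apparent size as a stand-in for distance -- close in if the box
    is smaller than the standoff size, back off if it is larger.
    """
    yaw = gain * (det.cx - 0.5)              # turn toward horizontal center
    pitch = gain * (det.cy - 0.5)            # tilt toward vertical center
    surge = gain * (target_area - det.area)  # hold the standoff distance
    return yaw, pitch, surge

# An animal already centered at the standoff size needs no correction:
print(tracking_command(Detection(cx=0.5, cy=0.5, area=0.05)))  # (0.0, 0.0, 0.0)
```

In a real system the surge term would come from stereo range or sonar rather than box size, but the structure — detector output in, velocity command out, repeated every frame — is the core of any follow behavior.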

Initial Testing

To ensure the agent could perform these tasks effectively, it underwent additional training in a simulated environment with an ROV and animals — much like playing a video game. After the team was satisfied with the agent’s performance, they integrated it onto MBARI’s (Monterey Bay Aquarium Research Institute’s) MiniROV — since ROVs offer greater control for testing purposes — and tested and refined it, teaching it to follow an artificial jellyfish mimic in a 10-meter-deep (33-foot-deep) test tank.
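The value of training in simulation is that a candidate behavior can be scored over many cheap, repeatable episodes before it ever touches real hardware. The toy example below illustrates that idea under heavy simplification: a target drifts randomly in one dimension, and a policy is scored on the fraction of time it keeps the target within camera range. The environment, dynamics, and policies here are invented for illustration; the project's actual simulator models a full ROV and three-dimensional animals.

```python
import random

def run_episode(policy, steps=200, seed=0):
    """Score a policy by the fraction of steps the target stays in view."""
    rng = random.Random(seed)   # fixed seed: same drift for every policy
    agent, target = 0.0, 0.0
    in_view = 0
    for _ in range(steps):
        target += rng.uniform(-0.2, 0.2)  # animal drifts randomly
        agent += policy(target - agent)   # agent reacts to relative position
        if abs(target - agent) < 1.0:     # still within camera range?
            in_view += 1
    return in_view / steps

follow = lambda err: 0.5 * err  # proportional pursuit of the target
idle = lambda err: 0.0          # baseline: never move

# Pursuit keeps the error bounded, so the target never leaves view:
print(run_episode(follow))  # 1.0
```

Comparing `run_episode(follow)` against `run_episode(idle)` on the same seed is the basic pattern: score candidate behaviors against baselines in simulation, and only promote the winners to tank and field tests.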

MiniROV Simulation
Remotely operated vehicle MiniROV in a simulated environment during the training of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.

Next Stop, the Ocean

In October 2024, the agent was ready for real-world testing in Monterey Bay. Deployed on MiniROV from Research Vessel Rachel Carson, it performed both tasks relatively well — finding and following siphonophores, comb jellies, and jellyfish — but still needed work.

Siphonophore
Siphonophore

Screenshots captured in the control room of Research Vessel Rachel Carson during the first field test on remotely operated vehicle MiniROV of the Deployable AI developed as part of the Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea project. Image courtesy of Deployable Artificial Intelligence for Exploration and Discovery in the Deep Sea/MBARI.

Multimedia

The images and video from this project add context and help bring the project to life. Click on a preview image below to view the full image or watch the video and get more information.

Education

Ocean Science for Educators provides the best of what the NOAA Ocean Exploration website has to offer to support educators in the classroom. Each theme page includes lessons, fact sheets, ocean facts, exploration notes, multimedia, and related past expeditions and projects. Below is the top education theme related to this project.

Team

Each team member’s path to this expedition is unique. Read their bios to find out what makes them ocean explorers.

Principal Investigator, Principal Engineer, MBARI (Monterey Bay Aquarium Research Institute)
Associate Professor of Biology, University of Dallas

Resources & Contacts