
AI Meets the Deep: Our Journey to 3rd Place in the 2024 Exploration Challenge
With this year's MATE ROV competition wrapped up, Sunk Robotics placed 3rd worldwide in the robotics portion and 9th overall once the marketing and advertising components were included. A good run indeed. Our robot performed well, though reliability will be our priority for next year's attempt.
A New Competition
Although MATE ROV is best known for its underwater robotics competition, it hosts other events as well. This year, Matty Harris and I took on a secondary challenge, the 2024 Ocean Exploration Video Challenge (OER), in which teams must build custom artificial intelligence models that identify underwater species in a provided video. In our case, we had to identify brittle stars in a minute-long video.
You can find a copy of the official PDF from MATE here.
Collecting Data
We started by looking for datasets that could give our AI model some sense of what to look for. We stumbled upon a few, but most were in formats we couldn't use. Eventually, we found a dataset hosted by the National Library of Medicine, and it was exactly what we needed. Although the images came in a mosaic format, we were able to convert them into annotated images by hand-labeling them with a custom-written Python program.
Brittle star sample before annotation from NLM database.
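For the curious, a hand-labeling tool along these lines can be surprisingly small. The sketch below is a reconstruction rather than our actual program: it loads a single frame (the file paths and the single brittle-star class are assumptions), lets you drag out boxes with the mouse, and writes them to disk when you press q.

```python
# A minimal sketch of a click-and-drag labeling helper built on OpenCV.
# This is a reconstruction, not our actual tool; the frame path, label path,
# and single class id (0 = brittle star) are assumptions.
import cv2

drawing, start, boxes = False, (0, 0), []

def on_mouse(event, x, y, flags, _param):
    global drawing, start
    if event == cv2.EVENT_LBUTTONDOWN:               # begin dragging a box
        drawing, start = True, (x, y)
    elif event == cv2.EVENT_LBUTTONUP and drawing:   # release to finish the box
        drawing = False
        boxes.append((*start, x, y))

image = cv2.imread("frames/frame_0001.png")
h, w = image.shape[:2]
cv2.namedWindow("annotate")
cv2.setMouseCallback("annotate", on_mouse)

while True:
    view = image.copy()
    for x1, y1, x2, y2 in boxes:
        cv2.rectangle(view, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("annotate", view)
    if cv2.waitKey(20) & 0xFF == ord("q"):           # q saves labels and quits
        break

# Write the boxes in YOLO format: class x_center y_center width height,
# with every value normalized to the image dimensions.
with open("labels/frame_0001.txt", "w") as f:
    for x1, y1, x2, y2 in boxes:
        f.write(f"0 {(x1 + x2) / 2 / w:.6f} {(y1 + y2) / 2 / h:.6f} "
                f"{abs(x2 - x1) / w:.6f} {abs(y2 - y1) / h:.6f}\n")
```

Each line of the output file describes one box in the normalized `class x_center y_center width height` form, which is exactly the label format consumed during training.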
Annotating a Thousand Frames
Without annotated images to learn from, the neural network was really just a dumb box. The NLM dataset only provided masked images, which was not enough to train the model on directly. Because we chose the YOLOv8 framework for our model, we needed to annotate each brittle star in each frame with a bounding box. To our surprise, we found that even a single annotated frame fed into training was sufficient to generate accurate results.
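To give a sense of how little glue code that workflow needs, here is a minimal sketch of fine-tuning and video inference with the Ultralytics YOLOv8 API; the dataset YAML name, video filename, and hyperparameters shown are assumptions, not our actual settings.

```python
# A minimal sketch of YOLOv8 training and video inference via Ultralytics.
# The dataset YAML and video name are hypothetical stand-ins.
from ultralytics import YOLO

# Start from a small pretrained checkpoint and fine-tune it on our labels.
model = YOLO("yolov8n.pt")
model.train(data="brittle_stars.yaml", epochs=100, imgsz=640)

# Run the trained weights over the competition video, saving an annotated copy.
results = model.predict(source="exploration_video.mp4", save=True, conf=0.25)
```

The `data` YAML simply points at the image and label folders and names the classes, so the bounding-box files produced during annotation plug straight into this call.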