

The aim of Reefscapers’ coral propagation program is to help corals repopulate the reef faster after massive bleaching events, so that the reef is ready to face the next one. As major El Niño events become more frequent with climate change, corals may soon not have enough time to recover between successive events.
To improve the efficiency of our coral propagation mission and make it feasible on a large scale, we need to better understand how corals grow. As we learn which parameters are best for coral growth, we can increase the survival rate of our transplanted colonies and improve the impact of our work and resources. This is why we have been taking monitoring pictures of our frames every 6 months from the moment they are first placed; this, however, represents a huge amount of work, which quickly limits the number of colonies we can transplant.
To bypass these limitations, we are building an autonomous catamaran that will take the pictures for us. Powered exclusively by solar energy, it will drive around our islands, using GPS and cameras to navigate its way to our frames. Using the latest advances in robotics, a Pixhawk autopilot will control the propellers and steer the catamaran to the waypoints commanded by an onboard navigation computer (a Raspberry Pi). This lightweight computer will process the live video feed to identify the catamaran’s surroundings; combined with the GPS signal, this will allow very precise positioning, which is essential for monitoring objects as small as a coral frame.
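To give a concrete idea of how the navigation computer can hand a destination to the autopilot, here is a minimal sketch using the pymavlink library over a MAVLink serial link. The serial port, baud rate, mode handling and the example coordinates are illustrative assumptions, not our exact configuration.

```python
from pymavlink import mavutil

# Connect to the Pixhawk from the Raspberry Pi's serial port.
# Port name and baud rate are assumptions; adjust to your wiring.
master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
master.wait_heartbeat()  # wait until the autopilot is talking to us

def goto_waypoint(lat_deg: float, lon_deg: float) -> None:
    """Ask the autopilot (assumed to be in GUIDED mode) to drive to a GPS point."""
    master.mav.set_position_target_global_int_send(
        0,                                   # timestamp (ms since boot), 0 = not used
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
        0b0000111111111000,                  # type_mask: only the position fields are active
        int(lat_deg * 1e7),                  # latitude  (degrees * 1e7)
        int(lon_deg * 1e7),                  # longitude (degrees * 1e7)
        0,                                   # altitude (ignored for a surface vessel)
        0, 0, 0,                             # velocity (ignored)
        0, 0, 0,                             # acceleration (ignored)
        0, 0)                                # yaw, yaw rate (ignored)

# Hypothetical coordinates of a coral frame near one of our islands
goto_waypoint(4.1748, 73.5089)
```

In practice the navigation computer would stream a sequence of such waypoints, refining each one with what the camera sees, so that the catamaran ends up holding position right above a frame.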
This device will allow us to collect more pictures than before to grow our database faster and more efficiently, while opening the door to advanced artificial intelligence analyses. We also plan to share the design and software with other scientists who may use it to conduct surveys, similar monitoring missions, or adapt it to their own specific needs.
Reefscapers has now transplanted more than 8,000 coral frames in the Maldives, each of them supporting 30 to 110 fragments of coral.
How are they doing? Which species grow better? Are there places where they have better chances of survival?
To answer all these questions and many more, we have been collecting pictures of our frames every 6 months. And our autonomous catamaran will help us take a lot more! But what to do with more than 200,000 pictures?
For each fragment, we want to automatically detect what kind of coral it is, its size, and its health. Obviously, these observations are limited by the quality of our pictures. For instance, we cannot expect to identify the exact species from so far away. But we can still differentiate between broader groups, such as Pocillopora or branching Acropora.
To achieve these goals and more, we are now using the latest advances in deep learning to automatically extract valuable information from these pictures, with something called a convolutional neural network: a mathematical model that imitates the neural connections of the brain and learns to detect objects in images.
First, we have to teach the software what we want it to recognise: coral frames, Acropora fragments, dead fragments, and so on. Once we show it enough training pictures, it is able to use this knowledge to infer the presence of such objects in new pictures that it hasn’t seen before. In the future, we also hope to develop a species identification model to provide other researchers with a reliable way to classify the corals that they work with.
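For readers curious about what this looks like in code, here is a minimal sketch of how an off-the-shelf object detector could be adapted to coral classes, assuming PyTorch and torchvision. The class list, confidence threshold and pretrained backbone are assumptions for illustration, not our exact pipeline.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Classes we want the detector to learn (background + our labels);
# this label set is illustrative, not the project's exact taxonomy.
CLASSES = ["background", "coral_frame", "acropora", "pocillopora", "dead_fragment"]

# Start from a network pre-trained on everyday photos (COCO) and
# replace its final layer so it predicts our coral classes instead.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, len(CLASSES))

# After fine-tuning on labelled monitoring pictures, run inference on a new image
# (a 3 x H x W tensor with values scaled to [0, 1]).
model.eval()
image = torch.rand(3, 800, 800)          # stand-in for a real frame photo
with torch.no_grad():
    prediction = model([image])[0]       # boxes, labels and scores for one image

for box, label, score in zip(prediction["boxes"], prediction["labels"], prediction["scores"]):
    if score > 0.5:                      # keep only confident detections
        print(CLASSES[label], box.tolist(), float(score))
```

Each detection gives us a bounding box, a class and a confidence score, which we can then track from one 6-month survey to the next to follow the growth and health of every fragment.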
To help save the coral reefs, we are putting robots and artificial intelligence to work!

Visit Reefscapers AI 4 Corals for more photos and videos, plus all the latest project updates.