Computerworld

Flying, diving drones trial to automate Great Barrier Reef surveys

Nowhere too dark, deep or dangerous for robotic vehicles

Researchers from the Australian Institute of Marine Science (AIMS) have this week returned from a 10-day mission on the Research Vessel Cape Ferguson, during which they trialled the use of underwater and aerial drones to conduct automated surveys of the Great Barrier Reef.

“As globally coral reefs are under enormous pressures, the need to monitor and to observe gets bigger and bigger,” explained AIMS’ data automation technology and imaging specialist Scott Bainbridge. “But at the moment we’re using people and expensive people and expensive boats, so what we’re looking for is some kind of force multiplier to be able to do more with less. So one of the obvious solutions is to use automated systems.”

In one of the trials, researchers fitted the institute’s six-thruster Blue ROV (Remotely Operated Vehicle) with a hyperspectral camera, and programmed it to follow a ‘transect line’ 20 metres below the surface.

A transect line is essentially a length of coloured tape laid on the seabed. Traditionally, a diver follows the line and takes photographs at regular intervals for later analysis.

“What we can do with a robot, is basically show it the start of the transect, and it will find the tape. And it will follow that transect tape to the end, using it as a guideline. Once it knows where it is it can navigate away from it for a short distance and do parallel transects and do them from different depths,” Bainbridge told Computerworld.
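
As a rough illustration of the vision-based line following described above (a minimal sketch, not AIMS’ actual control software), the snippet below finds a coloured tape in a camera frame and returns a normalised steering offset. The HSV thresholds and the proportional-control note are assumptions for illustration only.

```python
import cv2
import numpy as np

def tape_steering_offset(frame_bgr, hsv_lo=(20, 80, 80), hsv_hi=(35, 255, 255)):
    """Return the normalised left/right offset of the transect tape in a frame.

    hsv_lo/hsv_hi are placeholder thresholds for a yellow-ish tape; the real
    colour band would need tuning for the tape and the water conditions.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    m = cv2.moments(mask)
    if m["m00"] == 0:              # tape not visible in this frame
        return None
    cx = m["m10"] / m["m00"]       # x centroid of the tape pixels
    half_width = frame_bgr.shape[1] / 2
    return (cx - half_width) / half_width   # -1 (far left) .. +1 (far right)

# A controller could feed this offset to the ROV's yaw/lateral thrusters,
# e.g. a simple proportional command: yaw_rate = -k * offset.
```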

The Blue ROV being prepared for launch

The bot captured video footage and stills – including stereo pairs used to create a 3D map of an area – from the multiple cameras aboard the robot.
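
For a sense of how a stereo pair becomes depth data, the sketch below uses OpenCV’s block matcher to compute a disparity map. The file names and calibration values are placeholders, and the institute’s actual photogrammetry pipeline is not described in the article.

```python
import cv2
import numpy as np

# Placeholder file names; real surveys would use calibrated, rectified pairs.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: OpenCV returns fixed-point disparity, scaled by 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# With a calibrated rig, depth (m) = focal_length_px * baseline_m / disparity.
focal_length_px, baseline_m = 1200.0, 0.12   # assumed calibration values
depth = np.where(disparity > 0, focal_length_px * baseline_m / disparity, 0.0)
```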

The main measures the researchers and their robots record are the percentage cover of coral in an area and the scene’s ‘structural complexity’. The complexity scale runs from sparse, flat sand up to intricate coral structures. It’s an important measure because “areas that are more complex tend to be more biodiverse and support a larger number of species,” Bainbridge explains.
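
Percentage cover itself is a simple ratio once the image pixels have been labelled; here is a minimal sketch, assuming a hypothetical per-pixel label map from some segmentation step (the label codes are placeholders).

```python
import numpy as np

# Hypothetical label codes from an automated segmentation of a survey image.
SAND, CORAL, SPONGE, OTHER = 0, 1, 2, 3

def percent_coral_cover(label_map):
    """Fraction of classified pixels labelled as coral, as a percentage."""
    return 100.0 * np.count_nonzero(label_map == CORAL) / label_map.size

labels = np.random.randint(0, 4, size=(480, 640))  # stand-in for a real label map
print(f"coral cover: {percent_coral_cover(labels):.1f}%")
```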

The stills are run through AI routines that identify the major components in the images, such as sand, coral and sponges. The system then serves up the most interesting images to researchers, who manually identify the different species in each.
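
The article does not say how “most interesting” is decided. One common approach, sketched below as an assumption rather than AIMS’ method, is to rank images by how uncertain the classifier is and send the most ambiguous ones to a human.

```python
import numpy as np

def entropy(probs):
    """Shannon entropy of a class-probability vector; higher means less certain."""
    p = probs[probs > 0]
    return float(-(p * np.log(p)).sum())

def triage(image_probs, top_k=5):
    """Return the image IDs the model is least certain about, for human review."""
    ranked = sorted(image_probs, key=lambda k: entropy(image_probs[k]), reverse=True)
    return ranked[:top_k]

# Stand-in softmax outputs over (sand, coral, sponge) for three survey stills.
probs = {
    "IMG_001": np.array([0.90, 0.05, 0.05]),   # confidently sand
    "IMG_002": np.array([0.40, 0.35, 0.25]),   # ambiguous, worth a human look
    "IMG_003": np.array([0.10, 0.85, 0.05]),
}
print(triage(probs, top_k=1))   # -> ['IMG_002']
```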

“As the AI gets better we hope to do more and more of that through the AI rather than using people,” Bainbridge says.

The ROV was also trialled for the first time in Australia at night. Observing the reef in the dark is useful because some species are more active at night, and it also allows researchers to record corals and sponges that fluoresce.

The ROV can also descend to depths of up to 100m, beyond what can easily be reached by human divers.

“There’s a whole hidden part of the Barrier Reef we know very little about, between the shallows. We need to understand those. With temperature regimes changing some shallow species may go deeper so we need to understand how the deep and shallow water corals interact,” Bainbridge says.

The transect line technique still requires divers, who use the dive time to make more detailed observations while the bot completes the survey. For the trial, divers carried out the same survey as the bot so the quality of the robot’s data could be checked against the traditional method.

“Where we’re going is to be fully autonomous, so the robots can navigate by themselves,” Bainbridge says. “Although there are lots of challenges we need to solve to do that. The problem is you don’t have GPS underwater, also things like shallow coral reefs are very dynamic so there’s lots of waves and surge and we don’t have maps and we don’t have the imagery you have for terrestrial systems.”
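
A toy dead-reckoning calculation (a generic illustration, not the team’s navigation stack) shows why the lack of GPS matters: small velocity errors integrate into position drift that nothing underwater corrects.

```python
import numpy as np

def dead_reckon(velocities, dt=1.0, bias=(0.01, -0.01), noise_std=0.02, seed=0):
    """Integrate noisy, slightly biased body-frame velocities into a 2D track.

    With no GPS fixes to correct it, the small per-step errors accumulate,
    so position uncertainty grows over the length of a dive.
    """
    rng = np.random.default_rng(seed)
    pos = np.zeros(2)
    track = [pos.copy()]
    for v in velocities:
        measured = v + np.asarray(bias) + rng.normal(0.0, noise_std, size=2)
        pos = pos + measured * dt
        track.append(pos.copy())
    return np.array(track)

# 0.5 m/s straight ahead for 10 minutes; the true end point is (300, 0) metres.
true_velocity = np.tile([0.5, 0.0], (600, 1))
track = dead_reckon(true_velocity)
error = np.linalg.norm(track[-1] - np.array([300.0, 0.0]))
print(f"Position error after 10 minutes: {error:.1f} m")
```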

The use of bots instead of divers also allows more area to be observed, particularly in crocodile- and jellyfish-infested waters, Bainbridge added.

The other aim of the mission was to trial the ability of aerial drones to conduct reef surveys.

AIMS' marine technical officer Joe Gioffe and hyperspectral specialist Jonathan Kok with the drone

“We did some revolutionary stuff, we flew a 25kg hyperspectral camera with our large drone off our research vessel, and we flew it over a coral transect on John Brewer Reef, which is one of our long term monitoring sites,” AIMS technology team leader Melanie Olsen told Computerworld.

The drone is able to observe reef structure and also measure coral bleaching. Previously, aerial surveys were done using images taken from a light aircraft.

“It will allow for the acceleration of data collection and processing,” Olsen added.

Necessity, not novelty

AIMS is an enthusiastic adopter of robotic systems. In October it used a ‘surfing robot’ to complete a seven-day, 200-nautical-mile trial voyage of the central reef.

The test expedition saw the vessel collect a range of meteorological and oceanographic data and relay it back to shore in real-time.

The institute’s RangerBots meanwhile use computer vision to navigate underwater and inject crown-of-thorns starfish – a coral-eating predator that can reach plague proportions – with a deadly solution to control numbers.

The use of autonomous systems is not a novelty, but a necessity for the institute, Bainbridge said.

“We have no choice because the scale of the problem we’re trying to deal with is beyond our physical human resources. The Great Barrier Reef is the size of Italy. And so if you want the status of vineyards in Italy – that’s not a trivial problem. We’re trying to do the same sort of thing,” he said.

“We want to remain globally competitive and so we are preparing to boost our technological capabilities and we are exploring robotics, which can help us more effectively monitor large sections of the reef,” added Olsen.

The drone set to take flight from the RV Cape Ferguson