If the sobering scientific evidence tracking how quickly the glaciers are melting and oceans are warming up doesn’t get you down, then this latest UN report about how human activities are threatening to push 1 million plant and animal species toward extinction likely will. The news is pretty dire, and time is of the essence here: we know that action is needed, but can we act fast enough?
Not surprisingly, new technologies could help pinpoint negative environmental impacts as they unfold in real time around the world, allowing us to take action much more quickly. At Australia's Great Barrier Reef, a team of scientists from Queensland University of Technology (QUT) in Brisbane is now using drones, artificial intelligence and the massive computing power of the cloud to get a better handle on how warming ocean temperatures and acidification are affecting the enormous coral reef, meaning governments and other agencies can act sooner rather than later.
The researchers are collaborating with the Australian Institute of Marine Science (AIMS), a marine research center that has been monitoring the health of the reef for years, albeit with more conventional methods of collecting data and imagery: airplanes, helicopters, satellites and ocean surveys. While these approaches work, they are costly, the images they gather are often low-resolution, and it can take weeks to comb through the data.
That’s where AI and drone technology come in, allowing researchers to collect large amounts of data cheaply and quickly, and to process it just as fast with cloud computing. To do this, the research team adapted a commercially available drone to carry two cameras: a high-resolution digital camera and a hyperspectral camera. The digital camera captures red, green and blue light, covering the visible range of roughly 380 to 740 nanometers, and its images are used to construct a three-dimensional model of the reef on a computer. The hyperspectral camera, meanwhile, records reflected light across 270 spectral bands and can capture information up to 10 feet below the surface of the water.
“Hyperspectral imaging greatly improves our ability to monitor the reef’s condition based on its spectral properties,” Felipe Gonzalez, the associate professor leading the QUT team, explained to IEEE Spectrum. “That’s because each component making up a reef’s environment — water, sand, algae, etc. — has its own spectral signature, as do bleached and unbleached coral.”
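To get a feel for how matching pixels to spectral signatures works, here is a minimal sketch in Python using the spectral angle mapper, one common technique for this kind of classification. Note that this is purely illustrative and not the QUT team's actual pipeline: the reference signatures below are randomly generated stand-ins, whereas a real survey would use measured reflectance spectra for water, sand, coral and so on.

```python
import numpy as np

# Hypothetical reference signatures: reflectance values across the
# 270 spectral bands mentioned in the article. Random stand-ins here;
# real work would use measured spectra for each reef component.
N_BANDS = 270
rng = np.random.default_rng(0)
signatures = {
    "water": rng.random(N_BANDS),
    "sand": rng.random(N_BANDS),
    "bleached_coral": rng.random(N_BANDS),
    "healthy_coral": rng.random(N_BANDS),
}

def classify_pixel(pixel, signatures):
    """Label a pixel with the reference spectrum at the smallest spectral angle."""
    def angle(a, b):
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))
    return min(signatures, key=lambda label: angle(pixel, signatures[label]))

# A noisy observation of the "sand" spectrum still matches "sand".
pixel = signatures["sand"] + rng.normal(0, 0.01, N_BANDS)
print(classify_pixel(pixel, signatures))  # prints "sand"
```

The spectral angle is insensitive to overall brightness, which is convenient here since illumination varies with water depth; that is one reason it is a popular baseline for hyperspectral classification.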
But this extra layer of information comes with a catch: while an underwater survey might offer up a few dozen data points, a single hyperspectral image can produce thousands. One drone survey can generate over a thousand gigabytes of data, which would take weeks or even months to analyze on a regular desktop PC.
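A quick back-of-envelope calculation shows how the terabyte-scale figure is plausible. The frame size, bit depth and frame count below are assumptions for illustration (the article only reports the 270 bands), but numbers in this ballpark readily push a single survey past a thousand gigabytes:

```python
# Back-of-envelope data volume for one hyperspectral survey.
# Only the band count comes from the article; the rest are assumptions.
width, height = 1024, 1024   # pixels per frame (assumption)
bands = 270                  # spectral bands, per the article
bytes_per_sample = 2         # 16-bit radiance samples (assumption)
frames = 2000                # frames captured in one survey (assumption)

total_gb = width * height * bands * bytes_per_sample * frames / 1e9
print(f"~{total_gb:.0f} GB per survey")  # ~1132 GB per survey
```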
To tackle this issue, the team turned to Microsoft's Azure cloud computing service, through a grant offered by the company's AI for Earth initiative. By leveraging the power of the cloud, the team can now quickly and efficiently categorize the various spectral signatures in its drone imagery, shaving processing time from several weeks down to only two or three days. Detecting adverse changes early is critical to the survival of reef habitats; if processing took months, a given zone could well degrade past the point of rehabilitation before anyone saw the data.
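The big speedup from the cloud comes largely from the fact that survey imagery splits naturally into independent tiles that can be analyzed in parallel. The sketch below illustrates that fan-out pattern on a single machine with Python's standard `concurrent.futures`; the tile data and the per-tile "classification" are made-up placeholders, and Azure's actual batch infrastructure distributes this kind of work across many machines rather than threads.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def classify_tile(tile):
    # Placeholder for per-tile spectral classification: count pixels whose
    # mean reflectance falls below a made-up "possible bleaching" threshold.
    return int(np.sum(tile.mean(axis=-1) < 0.5))

def process_survey(tiles, workers=4):
    """Fan tiles out across workers and gather per-tile results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(classify_tile, tiles))

# Eight fake 64x64-pixel hyperspectral tiles with 270 bands each.
rng = np.random.default_rng(1)
tiles = [rng.random((64, 64, 270)) for _ in range(8)]
counts = process_survey(tiles)
print(len(counts), "tiles processed")
```

Because the tiles share no state, the same `classify_tile` function could be shipped unchanged to a fleet of cloud workers; the weeks-to-days improvement is essentially this pattern scaled up.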
Hyperspectral imaging is useful not only for tracking the health of ecosystems over time; it is also being used in agriculture and forestry to monitor the development and health of crops and trees, as well as in mineral identification, mapping, astronomy and military surveillance. For now, the team and AIMS plan to continue monitoring various zones of the reef, with the intention of expanding their operations to new areas later this year.
Images: AIMS, QUT