
AI Biodiversity ‘Time Machine’ Could Decide for Uncertain Future

Dec 25th, 2021 3:00am
Images: Tima Miroshnichenko via Pexels; University of Birmingham

The power of artificial intelligence has become increasingly evident in many aspects of our lives, whether in the algorithms driving recommendation engines and personal assistants, or in the complex computations behind new discoveries in life-saving pharmaceutical drugs.

A team of researchers from the University of Birmingham is now suggesting that AI could also be leveraged to solve some of the biggest environmental problems humanity faces, such as the serious threats to biodiversity posed by pollution and habitat loss from extractive activities like mining and logging.

The research team is proposing what they call the “Time Machine framework,” which would use AI to help decision-makers and experts from different disciplines “go back in time” (so to speak) by compiling existing environmental data in a way that lets them quantifiably observe how environmental changes have affected biodiversity and ecosystems over long periods. A better understanding of this historical record would, in turn, help predict which mitigation strategies are likely to work best.

The team’s proposal, which was recently published in “Trends in Ecology & Evolution,” outlines how AI could be used to help experts determine the best course of action for managing the impacts of biodiversity loss.

“Biodiversity loss happens over many years and is often caused by the cumulative effect of multiple environmental threats,” explained Dr. Luisa Orsini, the paper’s principal investigator and an associate professor of biosystems and environmental change at the University of Birmingham. “Only by quantifying the system-level biodiversity change before, during, and after pollution events can the causes of biodiversity and ecosystem service loss be identified.”

Orsini pointed out that the tools that are now widely being used fall short of what is needed, as they are often based on short-term observations of so-called indicator species — or organisms that serve as bellwethers of the environmental conditions of an existing ecosystem. While such short-term observations might give a relatively accurate snapshot of ecosystem changes that are happening in the present day, they don’t adequately provide a comprehensive assessment of the deeper, long-term causes that might be driving broader changes to the biodiversity of a local ecosystem. In contrast, the team believes that their AI-driven approach would help fill that gap.

“Lack of understanding of the processes that underpin ecosystem services has often led to mismanagement with clear dis-benefits for the environment, the economy and human well-being,” Orsini added. “Systemic approaches, like the framework presented here, enable the prioritization of interventions that accelerate ecological restoration, and mitigate environmental factors that cause harm to species associated with key ecosystem functions and services. Protecting every species is impossible. The Time Machine framework offers a way to prioritize conservation of taxonomic groups that deliver services and to guide regulatory interventions to limit the impact of the most harmful environmental contaminants.”

Understanding the Bigger Picture

The idea behind this AI-powered framework is to establish a more holistic understanding of how human interventions are impacting ecosystems across different scales, whether spatial, temporal, or economic. Such a wide-ranging, “big picture” view into the past and present would better equip governments and organizations with the data they need to ensure the “sustained delivery” of ecosystem services: things like clean air and water, dependable food production, and natural resources, all of which we take for granted that nature will provide.

The framework would also transcend the narrow boundaries that individual research disciplines place on measuring biodiversity loss, which can have ecological, sociocultural and economic implications that are not immediately apparent.

“Discipline-constrained approaches may neglect process interactions, result in research undertaken at inappropriate or disconnected scales, or use discipline-specific tools that are inadequate to address cross-disciplinary questions,” noted the team. “Decision-making frameworks that enable the prioritization of interventions for the sustainable use of ecosystems typically require multiple lines of evidence from different disciplines, making decisions by stakeholders challenging, especially when relationships between socioeconomic and ecological priorities are not linear.”

To test their proposal, the researchers applied AI-based tools to freshwater ecosystems, chosen because of their diversity and wide geographic distribution, and because they are increasingly threatened by pollution and degradation. In particular, their study focused on a lake for which they had extensive ecological and biological data.

“We applied the AI approach and determined that the decline in a specific taxonomic group of primary producers (e.g. green algae) was inversely correlated with ten herbicides among the hundreds that were quantified in the sediment,” explained Orsini. “This pilot study proved that the framework can be effectively applied to prioritize conservation and identify the most harmful contaminants for regulatory interventions.”
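
To make the pilot study’s result concrete, here is a minimal sketch of that kind of contaminant screen, assuming dated sediment-core measurements are available as simple arrays. Everything here (the column names, the generated data, the choice of Spearman rank correlation) is illustrative and not taken from the paper, which uses far more sophisticated models.

```python
# Hypothetical sketch: screen sediment-core contaminant records for
# inverse correlations with one taxon's abundance, in the spirit of the
# pilot study. All names and data here are illustrative.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_layers = 40  # dated sediment layers, oldest to newest

# Illustrative data: green-algae abundance plus concentrations of a few
# herbicides quantified in the same layers.
algae = pd.Series(rng.normal(100, 10, n_layers), name="green_algae")
herbicides = pd.DataFrame(
    {f"herbicide_{i}": rng.normal(1.0, 0.3, n_layers) for i in range(5)}
)
# Inject one inverse association so the screen has something to find.
herbicides["herbicide_0"] = 2.0 - 0.01 * algae + rng.normal(0, 0.05, n_layers)

# Rank-correlate each contaminant with the taxon's abundance; strongly
# negative coefficients flag candidate drivers of decline.
results = []
for name, series in herbicides.items():
    rho, p = spearmanr(algae, series)
    results.append((name, rho, p))

for name, rho, p in sorted(results, key=lambda r: r[1]):
    print(f"{name}: rho={rho:+.2f}, p={p:.3f}")
```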

The team’s method involved five distinct steps, ranging from taking sediment samples from different time periods to “fingerprinting” biochemical and ecosystem functions, so that biotic and abiotic shifts can be detected and analyzed with environmental DNA metabarcoding and mass spectrometry, using public databases of environmental data as benchmarks.

AI is then used to determine any links and changes between past and present biodiversity. “Our approach uses explainable network models combined with multiview learning to allow the simultaneous interrogation of different data matrices, to learn what components co-vary within a matrix (e.g. environmental pollutants), and among matrices (e.g. environmental pollutants and molecular operational taxonomic units or MOTUs),” explained Orsini.

“The Time Machine framework ‘learns’ from past correlations, tested iteratively against long-term empirical data collected from geological records of inland waters, and refined to predict the future biodiversity under different climate and pollution scenarios.”
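
The paper describes explainable network models; as a much simpler stand-in for the multiview idea Orsini outlines, the sketch below uses canonical correlation analysis (CCA) to find components that co-vary between a pollutant matrix and a MOTU-abundance matrix. It is a hypothetical illustration of cross-matrix covariation, not the team’s actual method.

```python
# Toy illustration of "multiview" co-variation between two data matrices:
# pollutant concentrations (view 1) and MOTU abundances (view 2).
# CCA stands in here for the team's explainable network models.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n_samples = 60                         # sediment layers / time points
X = rng.normal(size=(n_samples, 8))    # view 1: pollutant concentrations
Y = rng.normal(size=(n_samples, 12))   # view 2: MOTU abundances

# Inject a shared latent signal so the two views genuinely co-vary,
# with an inverse association as in the pilot study.
latent = rng.normal(size=n_samples)
X[:, 0] += latent
Y[:, 3] -= latent

cca = CCA(n_components=2).fit(X, Y)
X_c, Y_c = cca.transform(X, Y)

# The correlation of each pair of canonical variates measures how much
# shared structure the two matrices have along that component.
for k in range(2):
    r = np.corrcoef(X_c[:, k], Y_c[:, k])[0, 1]
    print(f"component {k}: canonical correlation = {r:.2f}")
```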

Of course, with so many co-varying factors, the team needed to streamline its approach. To that end, the researchers used an emulator, a simplified surrogate model that lets the system evaluate many scenarios at a fraction of the computational cost.

“Generating predictions that account for all possible scenarios is computationally intensive and time-consuming,” said Orsini. “The Time Machine framework uses emulators to provide robust predictions with calculated uncertainties across multiple scenarios while reducing computational cost and time. An ‘emulator’ is a low-order, computationally efficient model which emulates the specified output of a more complex model in function of its inputs and parameters.”
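
The paper does not spell out the emulator’s internals, but a Gaussian process is one common way to build a low-order model that returns predictions with calculated uncertainties. The sketch below, with a toy stand-in for the expensive simulator, is purely illustrative of that pattern.

```python
# Hypothetical emulator pattern: fit a cheap Gaussian-process model to a
# handful of expensive simulator runs, then query scenarios cheaply with
# uncertainty attached. The "simulator" here is a toy stand-in.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulator(pollution_level: np.ndarray) -> np.ndarray:
    # Placeholder for a costly ecosystem model run.
    return np.sin(3 * pollution_level) + 0.5 * pollution_level

# Run the full model at only a few design points...
train_x = np.linspace(0, 2, 8).reshape(-1, 1)
train_y = expensive_simulator(train_x).ravel()

# ...and fit the emulator to those runs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(train_x, train_y)

# The emulator now answers "what if" pollution scenarios in milliseconds,
# with a predictive standard deviation for every answer.
scenarios = np.linspace(0, 2, 50).reshape(-1, 1)
mean, std = gp.predict(scenarios, return_std=True)
print(f"predicted response range: {mean.min():.2f} to {mean.max():.2f}")
print(f"largest predictive std:   {std.max():.2f}")
```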

Ultimately, the team said that a shift toward more evidence-based, systemic approaches is needed in current environmental practice, so that issues are identified and quantified earlier and more comprehensively, though there is also the possibility of integrating other useful features.

“The Time Machine framework can be, in principle, extended beyond predictions based on the ecological and functional status of ecosystems,” suggested Orsini. “By coupling ecological and economic modeling, the framework can also enable the alignment of socio-economic and ecological outcomes under different climate and pollution scenarios. To overcome adoption barriers by stakeholders, an AI-based emulator dashboard can be developed, accessible to regulators and policymakers through data visualization techniques. These tools can be adapted for probabilistic predictions of ecosystem services to aid decision-making and socio-economic trade-offs.”

Read more in the team’s paper.
