Opening a window on environmental phenomena

In winter, dust and sand from the Sahara, blown westwards over the Atlantic Ocean, saturate the air off Cape Verde and the Canary Islands. © NASA EARTH OBSERVATORY

Data collected by satellites, drones, radars and microscopes provide a goldmine of information for better understanding our environment. And when these data are coupled with artificial intelligence (AI), they can unlock the secrets of phenomena taking place at every scale.

In a rapidly changing world where environmental threats abound, obtaining a better understanding of natural and anthropogenic processes can help corroborate points of view, guide conservation and renewal efforts, and orient new research. One key to gaining this kind of understanding is imaging technology. A wealth of data is being captured by satellites, radars, lidars and microscopes; the trick is to pull the different forms of data together and, sometimes with the help of AI, glean valuable insight. Researchers in a number of fields are leveraging the opportunities provided by new imaging technology – such as determining the chemical composition of plants, spotting waste floating in the ocean, quantifying and characterizing precipitation, mapping coral reefs in the Red Sea and assessing the health of large areas of cultivated land – in order to learn more about ecosystems of all sizes.

Freezing plants to get a better look

At EPFL’s Laboratory for Biological Geochemistry (LGB), scientists are studying a range of biological and other processes on a subcellular level: the breakdown of the relationship between microalgae and the coral they live in; the stress that soil salinity causes in plants; the reconstruction of past climate conditions based on tiny carbonate shells measuring less than one millimeter in length; and more. The scientists are using various microscopes and other high-tech microanalysis instruments to observe chemical transfers where even slight molecular and ionic variations can disrupt an entire organism and have an impact on a much larger scale.

Take the example of coral and the thousands of microalgae they house in what appears to be a perfect symbiotic relationship: the coral feeds off the nutrients released by the microalgae, while the microalgae absorb the CO₂ produced by the coral. But this age-old relationship, which incidentally is what gives corals their shimmering color, is now being threatened by global warming. Higher water temperatures put the microalgae under stress, prompting them to release compounds that are toxic to the coral. The coral responds by eventually kicking them out. This leads to coral bleaching and possibly even coral death. When this happens on a large scale, entire coral reef ecosystems can collapse, causing a huge loss of ocean biodiversity. For the past several years, a team of LGB scientists has been using an ion microscope to study the hidden secrets of this symbiotic relationship. “We use a NanoSIMS microscope, which basically bombards the samples with ions,” says Nils Rädecker, a postdoc at LGB. “This lets us observe transfer processes at a very high resolution. We can see individual cells and even subcellular structures.” Using the NanoSIMS, the scientists were able to discover new mechanisms in the breakdown of the symbiosis – like the selfish way in which microalgae stop supplying nutrients to coral well before the coral kicks them out.

This mosaic of individual images taken by a NanoSIMS microscope illustrates ammonium uptake in a cross-section of a sea anemone tentacle, with blue and pink indicating the areas of lowest and highest rates of ammonium assimilation, respectively. The technique enables researchers at EPFL to study metabolic interactions at subcellular resolution. © 2024 EPFL/Nils Rädecker, CC BY-SA 4.0
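To give a sense of how such a map is built up: a NanoSIMS acquisition is essentially a stack of per-pixel ion counts, and assimilation of an isotopically labelled nutrient shows up as enrichment of the heavy isotope above its natural abundance. The short Python sketch below illustrates that calculation in a generic way; the file names, ion species and colormap are illustrative assumptions, not the LGB's actual processing pipeline.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative only: per-pixel secondary-ion counts from a hypothetical
# NanoSIMS acquisition, e.g. 12C14N- and 12C15N- images of the same field
# after incubation with 15N-labelled ammonium.
counts_14n = np.load("tentacle_12C14N.npy")   # hypothetical file
counts_15n = np.load("tentacle_12C15N.npy")   # hypothetical file

# 15N fraction per pixel; pixels with no signal are masked out.
total = counts_14n + counts_15n
ratio = np.divide(counts_15n, total,
                  out=np.zeros_like(total, dtype=float),
                  where=total > 0)

# Enrichment above the natural 15N abundance (~0.37 %) indicates that the
# tissue has assimilated the labelled ammonium.
natural_abundance = 0.0037
enrichment = np.clip(ratio - natural_abundance, 0, None)

plt.imshow(enrichment, cmap="cool")   # blue = low uptake, pink = high uptake
plt.colorbar(label="15N enrichment (atom fraction above natural)")
plt.title("Ammonium assimilation map (illustrative)")
plt.show()
```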

“The problem with the NanoSIMS is that most soluble compounds are lost during the required sample preparation,” says Anders Meibom, a professor at LGB. To get around this problem, the scientists patiently developed a CryoNanoSIMS microscope that permits the analysis of biological samples in a frozen state and from which nothing is lost. “The CryoNanoSIMS therefore allows us to image precisely where soluble compounds, such as specific molecules like drugs or micropollutants, get accumulated in individual cells,” says Meibom. The microscope has opened up many new avenues of research. For instance, Priya Ramakrishna, a postdoc at LGB, is using it to produce high-resolution chemical maps of a model plant in order to investigate the cellular response to soil salinity. “Increasing soil salinity affects plant growth and therefore has consequences for food crops. We need to understand how the plants respond to this,” she says.

Images and AI give our planet a voice

The Earth has a surface area of over 510 million km² – plenty of room for ecosystems to thrive far from the beaten path, in remote areas that are impossible for field scientists to reach. Yet sensor-equipped drones, satellites and smartphones form a dense network of data-collection devices that can supply anonymized, usable information. “The satellite we use most, for instance, can take highly detailed images of areas 290 kilometers across at a resolution of 10 meters,” says Devis Tuia, a professor at EPFL’s Environmental Computational Science and Earth Observation Laboratory (ECEO). “Since the images are geolocated, we always know the coordinates of the location we are analyzing.”
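In practice, "geolocated" means that every pixel of such an image can be mapped back to geographic coordinates using the metadata stored with the raster. Here is a minimal sketch using the open-source rasterio and pyproj libraries; the file name and pixel indices are placeholders rather than an actual ECEO product.

```python
import rasterio
from pyproj import Transformer

# Placeholder path: any georeferenced raster (for example a 10 m satellite
# band exported as GeoTIFF) stores a coordinate reference system (CRS) and an
# affine transform that links pixel indices to map coordinates.
with rasterio.open("scene_10m_band.tif") as src:
    # Projected coordinates (e.g. UTM) of the centre of pixel (row=500, col=1200).
    x, y = src.xy(500, 1200)

    # Convert to longitude/latitude so the pixel can be placed on a world map.
    to_wgs84 = Transformer.from_crs(src.crs.to_wkt(), "EPSG:4326", always_xy=True)
    lon, lat = to_wgs84.transform(x, y)
    print(f"Pixel (500, 1200) lies at lon={lon:.5f}, lat={lat:.5f}")
```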

Whether for studying animal populations, monitoring crop distribution and maturity, identifying waste floating on the ocean surface or tracking glacier melt, the potential for using imaging technology to observe and monitor the environment is huge. “Every problem has its own sensor and its own preferred resolution. Also, the data available are very heterogeneous. We employ standard information-extraction algorithms and AI to sort, catalogue, search and process those heterogeneous, unstructured datasets and turn them into useful, structured information,” says Tuia. His research group has recently developed an AI program for the rapid 3D mapping of corals – organisms known to play an essential role in marine ecosystems – based on sequences filmed by commercially available cameras. With this technology, even divers with no special training can easily collect data on large coral reefs.
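Reconstructing a reef in 3D from ordinary video rests on well-established computer-vision building blocks: extracting frames, matching visual features between them and estimating how the camera moved. The sketch below, written with the open-source OpenCV library, shows only those first steps under assumed inputs (the video file and camera intrinsics are placeholders); it is not the ECEO pipeline, which goes much further with dense reconstruction and AI-based mapping.

```python
import cv2
import numpy as np

# Illustrative first steps of 3D reconstruction from consumer video:
# detect and match features between consecutive frames, then estimate the
# relative camera motion. A full photogrammetry pipeline adds triangulation,
# bundle adjustment and meshing on top of this.
cap = cv2.VideoCapture("reef_dive.mp4")            # hypothetical dive footage
orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Placeholder intrinsics for 1920x1080 footage (would come from calibration).
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])

ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev_kp, prev_des = orb.detectAndCompute(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), None)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    kp, des = orb.detectAndCompute(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), None)
    if des is None:
        continue
    matches = matcher.match(prev_des, des)
    pts_prev = np.float32([prev_kp[m.queryIdx].pt for m in matches])
    pts_curr = np.float32([kp[m.trainIdx].pt for m in matches])

    # The essential matrix and recovered pose describe how the camera moved
    # between the two frames; matched points can later be triangulated into 3D.
    E, inliers = cv2.findEssentialMat(pts_prev, pts_curr, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K)

    prev_kp, prev_des = kp, des

cap.release()
```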

And then there are satellite data. These images still have a lot of untapped potential, and researchers often find they have to train elementary image-recognition programs from scratch with the limited data available for a specific field. “Until now, no program existed that could quickly switch from recognizing a piece of waste to recognizing a tree or a building, for example,” says Tuia. He and his team, together with colleagues at Wageningen University in the Netherlands, MIT, Yale and the Jülich Research Center in Germany, have developed a chameleon-like application called METEOR. Built around a meta-learning algorithm, it can train recognition models for new objects after being shown just a handful of good-quality images, a huge time-saver where field data acquisition is difficult or very costly.
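The few-shot idea that makes this possible can be illustrated with a simple prototype-based classifier: a pretrained network embeds the handful of labelled images, the embeddings are averaged into one prototype per class, and new images are assigned to the nearest prototype. This is a generic sketch of that principle, not the METEOR code; the backbone, folder names and classes below are assumptions.

```python
import torch
from torchvision import models, transforms
from PIL import Image
from pathlib import Path

# Generic few-shot sketch (not the METEOR implementation): a frozen,
# pretrained backbone embeds images; each new class is represented by the
# mean embedding ("prototype") of a handful of labelled examples.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()      # keep the 512-d feature vector
backbone.eval()

prep = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def embed(path):
    with torch.no_grad():
        return backbone(prep(Image.open(path).convert("RGB")).unsqueeze(0)).squeeze(0)

# Hypothetical folders: a few example images per new class of interest.
support = {"floating_waste": Path("examples/waste").glob("*.jpg"),
           "open_water": Path("examples/water").glob("*.jpg")}
prototypes = {cls: torch.stack([embed(p) for p in paths]).mean(dim=0)
              for cls, paths in support.items()}

def classify(path):
    # Assign a query image to the class whose prototype is closest in feature space.
    z = embed(path)
    return min(prototypes, key=lambda c: torch.norm(z - prototypes[c]))

print(classify("new_scene_patch.jpg"))   # hypothetical query image
```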

Cloud profiling

Meanwhile, scientists at EPFL’s Environmental Remote Sensing Laboratory (LTE) are exploring why no two snowflakes – or raindrops, for that matter – are alike. They’re monitoring precipitation and studying clouds around the world, including in the Alps, the Antarctic, the Arctic and Greece, with the help of radars, lidars and a special device for taking 3D pictures of snowflakes. “Imaging is the only way we can observe changing weather phenomena across time and space and at many different scales,” says Alexis Berne, a professor at LTE. Even today, researchers struggle to obtain accurate, reliable quantitative data on precipitation, especially when it falls in solid form and in mountainous and polar regions. Yet such data can go a long way towards preserving water resources, predicting natural disasters and evaluating the effects of climate change in highly sensitive regions.

Supernumerary crystals

A lot also remains to be learned about how water droplets and ice crystals form inside clouds. The mechanism of condensation around certain aerosols (solid or liquid particles suspended in the atmosphere) that serve as so-called “glaciogenic” nuclei is well known, but a second process, known as secondary ice production, still conceals an element of mystery. When researchers pointed their radars at clouds to quantify precipitation formation, droplets and crystals far outnumbered the available aerosol particles; the numbers didn’t add up. “We still aren’t really sure how this secondary ice process works,” says Berne. His lab, along with others from EPFL (the Extreme Environments Research Laboratory and the Laboratory of Atmospheric Processes and their Impacts), will take part in a large EU-funded project to conduct cloud profiling at different locations around the world. The goal is to observe the behavior of cumulonimbus and other cloud families. “Here, computer modeling will also help us better understand the ambient conditions in which we make our observations,” says Berne.

Images derived from electromagnetic waves

Berne adds, however: “We don’t perform the kind of image analysis that’s done in biomedical imaging, for example.” The radars used by scientists in his field produce dozens of gigabytes of data each day, which are analyzed to perform case studies of specific weather phenomena and to generate statistics. “The factors we’re most interested in studying are generally those observed indirectly,” says Berne. “Lidars and radars operate with electromagnetic waves, and we measure the electromagnetic properties of objects in real time. Our work focuses on retrieval algorithms that will enable us to extract information about the microphysical properties of cloud particles, to better understand the mechanisms involved and quantify precipitation more precisely.”
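A textbook example of such a retrieval, drawn from the general radar-meteorology literature rather than from the LTE's own algorithms, is the empirical Z-R relation Z = a·R^b, which turns measured reflectivity into an estimated rain rate. The sketch below uses the classic Marshall-Palmer coefficients (a = 200, b = 1.6); real retrievals of microphysical properties are considerably more sophisticated.

```python
import numpy as np

# First-order illustration of a radar retrieval: the empirical Z-R relation
# Z = a * R**b links reflectivity Z (mm^6 m^-3) to rain rate R (mm h^-1).
# Marshall-Palmer coefficients are used here; they vary with precipitation type.
A, B = 200.0, 1.6

def rain_rate_from_dbz(dbz):
    """Convert reflectivity in dBZ to an estimated rain rate in mm/h."""
    z_linear = 10.0 ** (np.asarray(dbz) / 10.0)   # dBZ -> linear reflectivity
    return (z_linear / A) ** (1.0 / B)

# Example: a few reflectivity values along a radar beam, from light to intense rain.
print(rain_rate_from_dbz([20, 35, 50]))   # roughly 0.6, 5.6 and 49 mm/h
```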


Author: Cécilia Carron

Source: Environmental Computational Science and Earth Observation Laboratory

This content is distributed under a Creative Commons CC BY-SA 4.0 license. You may freely reproduce the text, videos and images it contains, provided that you indicate the author’s name and place no restrictions on the subsequent use of the content. If you would like to reproduce an illustration that does not contain the CC BY-SA notice, you must obtain approval from the author.