EPFL now has a dedicated hub for support in image analysis

Edward Andò © 2022 Alain Herzog

EPFL researchers facing image-analysis problems or challenges now have a dedicated point of contact: the Image Analysis Hub at the EPFL Center for Imaging. We spoke with Edward Andò, who’s been leading the hub since it opened a few months ago.

Imaging techniques play an essential role in a host of disciplines, whether to spot a microcrack in a material, detect an exoplanet or observe a protein’s reaction, for example. And even though many different instruments are used in scientific imaging – telescopes, electron microscopes, CAT scanners, synchrotrons and more – the researchers using them all run into similar problems when it comes to analyzing the data they generate. Whether researchers are seeking a clear picture of how bacteria behave, how a planet rotates, or how an animal moves based on images taken by a drone hundreds of meters in the air, the analysis challenges don’t vary much from one instrument to the next and often rely on the same mathematical models. To help these researchers analyze their imaging data quickly and reliably, the EPFL Center for Imaging has set up a support hub led by Edward Andò, an engineer who has built up extensive image-analysis know-how through his work on both measurement systems and methods for crunching data.

Watching stressed grains of sand dance

In some cases, researchers form an emotional attachment with the machines they work with. That was true for Andò and the X-ray microtomography machine he used for his PhD thesis at the 3SR Laboratory in Grenoble. His eyes glimmer with a hint of affection and almost nostalgic admiration for the instrument, which lets scientists observe in 3D how samples behave in response to stimuli. By developing several advanced measurement and analysis methods, Andò was able to generate images of individual grains of sand as they rotate. “At first we learned as we went along, running experiments on machines that are nothing like those employed today,” he says. “The instrument I cut my teeth on was one of the first models of its kind.” While it may sound like Andò is describing a period in the distant past, those first machines actually appeared less than a decade ago. And today developers have taken the technology even further, largely through advances in analysis software.

Adding neutron imaging to X-rays

Having obtained a position as a research engineer at the CNRS, Edward Andò helped build new equipment at the Institut Laue-Langevin, moving “to the next dimension with the development of a bi-modal neutron and X-ray tomograph,” he jokes. “Neutrons are extremely sensitive to hydrogen, meaning they can be used in powerful techniques for observing even the tiniest structural details of a material without damaging it. For instance, we can see how the porosity of heterogeneous materials changes in response to dissolution and precipitation reactions. In other words, we can examine the inner workings of materials in a completely non-intrusive way.” This kind of research involves high-tech instruments but also advanced analysis software – of which Andò strongly prefers the kind that’s free and open source.

In some cases, researchers would be better off using software that’s simpler than an AI and applied the right way

Edward Andò

3D acquisition and 3D image analysis are perfect examples of cross-disciplinary tools, since all scientific and engineering disciplines – from materials science and electronics to medicine, biology, geophysics and astrophysics – use them to observe matter on a micron scale. Andò, who speaks several languages, came to EPFL a year ago and naturally began working in civil engineering, including at the PIXE platform, which hosts “the big brother of the tomograph I used in Grenoble,” he explains. In fact, EPFL regularly shares new findings and methods with the University of Grenoble. “EPFL technicians came to Grenoble to learn the right procedures for observing shear tests,” says Andò. Never one to shy away from a challenge, he’s also worked on research projects with Lausanne University Hospital (CHUV), the University of Lausanne and labs at other EPFL schools.

Advice and an assessment of the pros and cons

Andò’s extensive experience with image capture and analysis methods is what led to his appointment as head of the new hub, which currently employs three people. The goal is to create a single point of contact where all EPFL researchers, regardless of their particular field, can obtain expert advice and share their experience. “No question is too simple,” he says. The hub has set up a chat feature on the Center for Imaging website in order to provide quick and personalized replies to users’ questions. The plan is to eventually build a community of experts that can answer questions for specific disciplines.

The use of imaging technology in general is expanding rapidly, as is the deployment of specialized image-analysis software – which can often leave non-experts scratching their heads. Andò’s hub gives EPFL researchers a list of the software currently available along with the pros and cons of each one so that users can make informed decisions. For example, deep-learning programs that have been trained to spot details in thousands of images in the blink of an eye are highly powerful, but they also eat up a lot of energy and usually require advanced training. “In some cases, researchers would be better off using software that’s simpler and applied the right way,” says Andò. Some form of training is important, however, for researchers to get a better understanding of image analysis, and to that end the hub held its first summer program for PhD students in early July.

Andò doesn’t intend to stop at human input in his efforts to give researchers the best advice possible. He hopes to one day teach machines to determine which software program is best for a given application and suggest the various factors that need to be taken into account. “Ideally, we’ll get to the point where researchers will just have to download a program and enter their data, without having to write a single line of code,” he says. “That means we’ll have to pay careful attention to the accuracy of the underlying code and test it thoroughly.”