New AI model transforms research on metal-organic frameworks

"A computer server transformed by MOFs (image creeted with an AI). Credit: 2023 EPFL/ Kevin Jablonka- CC-BY-SA 4.0

Researchers at EPFL and KAIST have developed a new AI model that significantly improves the understanding of metal-organic frameworks, promising materials for hydrogen storage and other applications.

How does an iPhone predict the next word you’re going to type in your messages? The technology behind this, which is also at the core of many AI applications, is called a transformer: a deep-learning algorithm that detects patterns in datasets.
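To make the next-word idea concrete, here is a deliberately tiny sketch of autocomplete using word-pair counts from a made-up corpus. A real transformer learns far richer statistical patterns, but the prediction task is the same: given what came before, suggest the most likely continuation.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; any real model trains on vastly more text.
corpus = "the model predicts the next word the model learns patterns".split()

# Count which word follows which.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def suggest(word):
    """Return the word most often seen after `word`, or None."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # "model" follows "the" twice, "next" only once
```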

Now, researchers at EPFL and KAIST have created a transformer for Metal-Organic Frameworks (MOFs), a class of porous crystalline materials. By combining organic linkers with metal nodes, chemists can synthesize millions of different materials with potential applications in energy storage and gas separation.

The “MOFTransformer” is designed to be the ChatGPT for researchers who study MOFs. Its architecture is based on the transformer, a model developed at Google Brain that can process natural language and forms the core of popular language models such as GPT-3, a predecessor to ChatGPT. The central idea behind these models is that they are pre-trained on a large amount of text, so when we start typing on an iPhone, for example, models like this “know” and autocomplete the most likely next word.

“We wanted to explore this idea for MOFs, but instead of giving a word suggestion, we wanted to have it suggest a property,” says Professor Berend Smit, who led the EPFL side of the project. “We pre-trained the MOFTransformer with a million hypothetical MOFs to learn their essential characteristics, which we represented as a sentence. The model was then trained to complete these sentences to give the MOF's correct characteristics.”

The researchers then fine-tuned the MOFTransformer for tasks related to hydrogen storage, such as the storage capacity of hydrogen, its diffusion coefficient, and the band gap of the MOF (an "energy barrier" that determines how easily electrons can move through a material).

The approach showed that the MOFTransformer could get results using far less data than conventional machine-learning methods. “Because of the pre-training, the MOFTransformer already knows many of the general properties of MOFs; and because of this knowledge, we need less data to train for another property,” says Smit. Moreover, the same model could be used for all properties, while in conventional machine learning a separate model must be developed for each application.
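The pretrain-then-fine-tune pattern described above can be sketched in a few lines. Everything here is illustrative, not the actual MOFTransformer API: a frozen "backbone" stands in for the pre-trained model, and only a small per-property "head" is fitted on the scarce labelled data for each new task.

```python
def pretrained_backbone(features):
    # Stand-in for the pre-trained model: maps raw descriptors to a
    # fixed embedding. In reality this would be a deep transformer.
    return [sum(features) / len(features), max(features) - min(features)]

def fit_head(embeddings, targets):
    # Fit a one-parameter least-squares head on top of the frozen
    # backbone: the only part needing labelled data per property.
    xs = [e[0] for e in embeddings]
    scale = sum(x * t for x, t in zip(xs, targets)) / sum(x * x for x in xs)
    return lambda emb: scale * emb[0]

# Tiny labelled dataset for one property (hypothetical numbers).
mofs = [[1.0, 3.0], [2.0, 6.0], [0.5, 1.5]]
targets = [4.0, 8.0, 2.0]

embeddings = [pretrained_backbone(m) for m in mofs]
predict = fit_head(embeddings, targets)
print(predict(pretrained_backbone([1.5, 4.5])))  # 6.0
```

The design point is that `pretrained_backbone` is shared across every property, so each new task only costs a cheap `fit_head` call, mirroring why pre-training reduces the data needed per application.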

The MOFTransformer is a game-changer for the study of MOFs, providing faster results with less data and a more comprehensive understanding of the material. The researchers hope that the MOFTransformer will pave the way for the development of new MOFs with improved properties for hydrogen storage and other applications.


National Research Foundation of Korea (NRF)

Swiss National Supercomputing Center

Horizon 2020 (Accelerating CCS Technologies)

Swiss Federal Office of Energy (SFOE)


Yeonghun Kang, Hyunsoo Park, Berend Smit, Jihan Kim. MOFTransformer: A multi-modal pre-training transformer for universal transfer learning in metal-organic frameworks. Nature Machine Intelligence, 13 March 2023. DOI: 10.1038/s42256-023-00628-2

Author: Nik Papageorgiou

Source: EPFL

This content is distributed under a Creative Commons CC BY-SA 4.0 license. You may freely reproduce the text, videos and images it contains, provided that you indicate the author’s name and place no restrictions on the subsequent use of the content. If you would like to reproduce an illustration that does not contain the CC BY-SA notice, you must obtain approval from the author.