Seminar on Euclidean Fast Attention for Atomic Modeling by Thorben Frank

Join us for a seminar by Thorben Frank on Friday, May 16, at 3:00 PM in MXG 312. Thorben is a PhD student in the Machine Learning Group at TU Berlin and the Berlin Institute for the Foundations of Learning and Data (BIFOLD). His work focuses on developing machine learning methods for quantum chemistry, combining geometric deep learning with message passing neural networks to shape a new generation of machine learning force fields.
Long-range correlations are essential across numerous machine learning tasks, especially for data embedded in Euclidean space, where the relative positions and orientations of distant components are often critical for accurate predictions. Self-attention offers a compelling mechanism for capturing these global effects, but its cost scales quadratically with the number of inputs, a significant practical limitation. This issue is particularly pronounced in computational chemistry, where the stringent efficiency requirements of machine learning force fields (MLFFs) often preclude the accurate modeling of long-range interactions. Computationally more efficient fast attention formulations do exist in principle, but they face a fundamental issue when applied to data embedded in Euclidean space: while encoding information about spatial arrangements is straightforward in standard quadratic-scaling attention, it is far less clear how to achieve this in fast attention formulations.
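To make the scaling contrast concrete, the sketch below compares standard softmax self-attention with a generic kernelized ("fast") variant. This is a minimal illustration, not the EFA mechanism from [1]: the feature map `phi` is a placeholder choice, and the point is that the linear form never materializes per-pair quantities, which is precisely why injecting relative spatial information into it is difficult.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Standard self-attention: forming the N x N score matrix makes the
    cost quadratic in N. Pairwise geometric terms (e.g. distances) could
    simply be added to `scores` here."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])            # (N, N)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V                                       # (N, d)

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    """Kernelized 'fast' attention: a positive feature map phi lets the
    computation be regrouped so the N x N matrix is never formed; the cost
    is linear in N, but no per-pair hook for spatial information remains."""
    KV = phi(K).T @ V                                  # (d, d) summary
    Z = phi(Q) @ phi(K).sum(axis=0)                    # (N,) normalizer
    return (phi(Q) @ KV) / Z[:, None]                  # (N, d)

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(6, 4)) for _ in range(3))
print(softmax_attention(Q, K, V).shape, linear_attention(Q, K, V).shape)
```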
In this seminar, Thorben will discuss his recent work introducing Euclidean Fast Attention (EFA), a fast attention-like mechanism designed for Euclidean data that can be easily incorporated into existing model architectures. A core component of EFA is the Euclidean Rotary Positional Encoding (ERoPE), which enables efficient encoding of spatial information while respecting essential physical symmetries. He and his co-authors empirically demonstrate that EFA effectively captures diverse long-range effects, enabling EFA-equipped MLFFs to accurately describe chemical interactions that conventional MLFFs fail to model correctly.
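For intuition, ERoPE builds on the rotary positional encoding (RoPE) idea familiar from language models: queries and keys are rotated by position-dependent angles so that their inner products depend only on relative positions, which composes naturally with fast attention. The sketch below shows only this standard one-dimensional building block as an assumed illustration; the symmetry-respecting generalization to 3D Euclidean atomic positions is the contribution of [1] and is not reproduced here.

```python
import numpy as np

def rope_1d(x, pos, theta=10000.0):
    """Rotary positional encoding in 1D: rotate each feature pair by an
    angle proportional to the position, so that the dot product of two
    encoded vectors depends only on their relative position."""
    d = x.shape[-1]
    freqs = theta ** (-np.arange(0, d, 2) / d)     # (d/2,) rotation frequencies
    angles = pos[:, None] * freqs[None, :]         # (N, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin             # 2D rotation of each pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# The relative-position property: shifting all positions by the same
# offset leaves every pairwise dot product unchanged.
rng = np.random.default_rng(1)
q, k = rng.normal(size=(3, 8)), rng.normal(size=(3, 8))
pos = np.array([0.0, 1.5, 4.0])
a = rope_1d(q, pos) @ rope_1d(k, pos).T
b = rope_1d(q, pos + 7.0) @ rope_1d(k, pos + 7.0).T
print(np.allclose(a, b))  # True
```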
[1] Frank, J. Thorben, et al. "Euclidean fast attention: Machine learning global atomic representations at linear cost." arXiv preprint arXiv:2412.08541 (2024).