Why It Matters to Neuroscience How Fast We Can Compute
This Research Topic, published in Frontiers in Neuroinformatics, provides an update on the state of neuroscientific software, assesses the impact of increased computational capabilities on scientific questions, and gives an outlook on future opportunities and challenges in computational neuroscience.
Twenty-odd years ago, the computational neuroscience community was facing a software crisis: software development was no longer progressing as expected and reproducibility was declining. Several initiatives, perhaps most notably the International Neuroinformatics Coordinating Facility (INCF), were launched to help develop standards and best practices in the field and to establish a scientific approach to simulation technology.
At the same time, technological developments in computing were advancing at an unparalleled pace. In the span of 20 years, the computational performance of the fastest supercomputers increased by a factor of 100,000 to above 1 ExaFLOPS. This advancement also required significant software adaptations, and simulators had to change substantially to take advantage of the increased computational power. “These changes went, however, mostly unnoticed from the point of view of the users,” notes Professor Markus Diesmann, Director at the Institute of Neuroscience and Medicine and Institute for Advanced Simulation at Juelich Research Centre.
A promising advance in the field, as observed in this Research Topic, is a marked increase in the complexity of network models, with more biologically relevant and more realistic models becoming the standard. “These full-scale simulations are decisive as they remove many uncertainties on the scaling of emerging network phenomena with network size,” explains Professor Felix Schürmann, Blue Brain Computing Director at EPFL. “In the other direction, several advances now delve into the subcellular realm, with membranes and other biochemical phenomena being incorporated,” he continues.
On the code development side, a clear shift towards open, community-based development models is seen, where the source code of simulation engines as well as executable model descriptions are maintained in open repositories. The articles in this collection also address the sustainability and portability of scientific software. “This is all the more relevant as it has become apparent that neuroscientific software can have a life span of several decades and serve more than one specific scientific goal,” remarks Omar Awile, High Performance Engineer at the EPFL Blue Brain Project.
James Courtney Knight, Research Software Engineering fellow at the University of Sussex, continues: “Focusing on software sustainability can drive innovations in complex scientific software development, through robust continuous integration, testing, and documentation workflows, as modularity and composability become increasingly important.”
A major theme addressed in the Research Topic and noted by the editors is the evolving hardware architecture landscape: “Several contributions focus on utilizing parallelism in latency-optimized CPUs and GPUs for accelerating simulation in popular neural network simulators and solving the challenges of handling massive parallelism and distributed computing in HPC clusters,” observes Professor Thomas Nowotny, Head of the AI Research Group at the University of Sussex. “Others discuss the uncertainty of microchip scaling in the context of neural simulation, the shift towards specialized components for AI applications, and the potential for the use of neuromorphic hardware in HPC systems for conventional computing and complex biologically fit models,” adds James B. Aimone, Distinguished Member of Technical Staff at Sandia National Laboratories. Finally, a number of contributions explore the possibilities of using different simulation algorithms and techniques to improve memory efficiency and simulation speed. Also emphasized is the importance of developing benchmarks and benchmarking practices to quantify the performance of these diverse hardware architectures.
In this Research Topic, 145 contributing authors explore these topics through 22 articles, giving an overview of the current and likely future state of the simulation neuroscience field from the computing perspective. The editorial, meanwhile, summarizes the main trends to have emerged from the many contributions and acts as a practical guide and pointer to the relevant research articles. “We’re thrilled to witness the active and influential role of this community in advancing the scientific pursuit of neuroscience, at all levels,” enthuses Schürmann. “Additionally, we are pleased that the community united to demonstrate the significance of computing's evolution in enabling us to ask fundamental scientific questions, both now and in the future,” he concludes.
Aimone, J.B., Awile, O., Diesmann, M., Knight, J.C., Nowotny, T., and Schürmann, F. (2023). Editorial: Neuroscience, Computing, Performance, and Benchmarks: Why It Matters to Neuroscience How Fast We Can Compute. Frontiers in Neuroinformatics, Volume 17.
https://www.frontiersin.org/articles/10.3389/fninf.2023.1157418/full