Forever young: built infrastructure that lasts
The safety of aging bridges and other built structures is difficult to assess, limiting options when they reach the end of their planned service life. A new method could encourage rejuvenation of infrastructure, potentially indefinitely.
Civil engineers like to play it safe, which, given the types of structures they build, is usually a good thing. You would not want to cross the Golden Gate Bridge or travel through the new Gotthard tunnel if you didn’t trust the engineering. But too much prudence has its downsides, such as unnecessary rehabilitation work or even the demolition of old but still safe structures. Because of their complexity, it is almost impossible to diagnose aging infrastructure accurately using conventional engineering models. By combining sensor data with an iterative multiple-model selection process, researchers at EPFL have developed an approach that could help extend the lifespan of built structures; they tested it on a road bridge in New Jersey, USA. Their work is published in a recent edition of the journal Engineering Structures.
According to Ian Smith, the principal investigator behind the study, models based on the design of a structure are always wrong to some degree, and the devil is in the details. “When engineers model a bridge, they always have to make assumptions. Should they include the sidewalk? What about the stiffness of joints and bearings? These assumptions bias models so that they consistently err in the same direction. We call these non-random errors systematic errors. While random errors can cancel each other out, systematic errors are much less likely to do so, since errors pointing in one direction would need to have exactly the same magnitude as errors pointing in the other.”
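The distinction can be illustrated numerically. In this minimal sketch (not taken from the study, and with arbitrary noise parameters), zero-mean random errors largely cancel when averaged, while errors that all share the same sign accumulate into a persistent bias:

```python
import random

random.seed(42)
n = 1000

# Random errors: zero-mean noise; positive and negative values tend to cancel.
random_errors = [random.gauss(0, 1.0) for _ in range(n)]

# Systematic errors: every error points the same way, so no cancellation occurs.
systematic_errors = [abs(random.gauss(0, 1.0)) for _ in range(n)]

mean_random = sum(random_errors) / n
mean_systematic = sum(systematic_errors) / n

print(f"mean random error:     {mean_random:+.3f}")
print(f"mean systematic error: {mean_systematic:+.3f}")
```

Running this shows the average random error sitting near zero while the average systematic error stays well away from it, which is why a single carefully tuned model cannot be trusted to self-correct.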
If carefully designing and fine-tuning a single model doesn’t work, then what does? Smith and his co-author Romain Pasquier argue that finding models that accurately describe the behavior of a structure requires comparing measurement data with many candidate models and discarding those that fail the comparison.
Disqualifying poor performers
Smith and Pasquier applied their method to diagnose the structural health of an existing eight-lane road bridge in New Jersey, USA. First, the bridge was equipped with displacement sensors. Then, up to a dozen trucks were parked on the bridge in a variety of configurations and the bridge’s response was recorded. By comparing the measured displacements to those predicted by a large number of computer models, the researchers were able to determine which, if any, of the models were promising and to eliminate the rest.
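The elimination step described above can be sketched as follows. The model names, displacement values, and the flat 0.5 mm uncertainty bound are all hypothetical; the actual study uses detailed structural models and formal uncertainty bounds rather than a single threshold:

```python
# Candidate models: each predicts displacements (mm) at three sensor
# locations for a given truck-loading configuration. Illustrative values.
candidate_models = {
    "rigid_joints_with_sidewalk":    [2.1, 3.4, 2.8],
    "rigid_joints_no_sidewalk":      [2.6, 4.1, 3.3],
    "flexible_joints_with_sidewalk": [3.0, 4.6, 3.9],
    "flexible_joints_no_sidewalk":   [3.5, 5.2, 4.4],
}

# Displacements measured on the instrumented bridge (illustrative).
measured = [2.9, 4.5, 3.7]

# A model is falsified if any predicted displacement falls outside the
# measurement plus/minus a combined uncertainty bound.
threshold = 0.5

def is_falsified(predicted, measured, threshold):
    return any(abs(p - m) > threshold for p, m in zip(predicted, measured))

surviving = sorted(
    name
    for name, pred in candidate_models.items()
    if not is_falsified(pred, measured, threshold)
)
print(surviving)  # only models consistent with the data remain
```

Note that more than one model can survive falsification: the data rule models out rather than single one in, which is why the researchers iterate with further model classes and measurements.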
Guided by the measured data and their expertise, the researchers continued to propose alternative model types and either improve upon them or reject them. “When all instances of a model class are falsified, our approach provides unique support for model-class exploration. This helps discover, for example, previously unknown levels of reserve capacity that can be used for subsequent decision making related to extension, repair, improvement and replacement,” explains Smith. After repeating the process four times, the researchers ended up with models that reliably replicated the observed behavior of the actual bridge.
A trillion-dollar question
“Replacing all aging infrastructure is not cost effective and not sustainable,” says Smith, who estimates that there is at least one trillion dollars’ worth of reserve capacity in existing infrastructure. The model falsification methodology he is proposing has applications beyond extending the lifespan of bridges. “This is a general data-interpretation approach that is applicable to all sensed systems where modeling uncertainty is high,” he says. In his research group, model falsification has been applied to optimizing the placement of sensors in water distribution networks and to operating a self-deploying bridge, among many other projects.