Seminar on uncertainty-aware Bayesian regression by Dr. T. Swinburne

Join us for a seminar by Dr. T. Swinburne, "Quantifying and Propagating Model-Form Uncertainty via Misspecification-Aware Bayesian Regression", on Tuesday, 01.04.2025, in room MA B1 524.

All models are inherently misspecified: no single choice of parameters can exactly match observations. This leads to model-form uncertainty, in which the parameters themselves carry an intrinsic error. In addition, finite-capacity models such as polynomials or partially frozen neural networks are often underparametrized, i.e. the number of training data far exceeds the number of parameters, so the epistemic uncertainties reported by standard Bayesian inference are vanishingly small even though the model error persists.
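
To make this tension concrete, the short numpy sketch below fits an underparametrized linear model to data generated by a cubic: the Bayesian posterior spread on the parameters shrinks like 1/sqrt(N), while the error due to misspecification stays finite. This is a self-contained illustration, not code from the talk or from [2]; the model, data, and prior/noise precisions (alpha, beta) are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return x ** 3          # ground truth the linear model cannot represent

def features(x):
    # Underparametrized model: intercept + slope only
    return np.stack([np.ones_like(x), x], axis=1)

alpha, beta = 1e-6, 1e4    # prior precision, assumed noise precision (illustrative)

for n in (10, 100, 1000):
    x = rng.uniform(-1.0, 1.0, n)
    y = true_f(x) + rng.normal(0.0, 0.01, n)                    # tiny observation noise
    Phi = features(x)
    S = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)   # posterior covariance
    w = beta * S @ Phi.T @ y                                    # posterior mean
    epistemic = np.sqrt(np.trace(S))                            # conventional parameter uncertainty
    model_error = np.sqrt(np.mean((Phi @ w - true_f(x)) ** 2))  # misspecification error
    print(f"N={n:5d}  posterior std ~ {epistemic:.1e}  model error ~ {model_error:.2f}")

# The posterior std collapses as data accumulate, yet the model error does not:
# the misspecification-blind likelihood cannot see the model-form uncertainty.
```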

Dr. Swinburne will present recent work [1] on a novel approach that targets the true generalisation error: a misspecification-aware error measure for which the misspecification-blind log-likelihood of Bayesian inference is only a (Hoeffding/Jensen) upper bound, closely analogous to the Gibbs-Bogoliubov bound (F ≤ U) in statistical mechanics. Whilst direct minimisation of the generalisation error is not tractable, the work derives a novel condition that any valid posterior must obey. The proposed method, POPS [1,2], efficiently evaluates this condition for linear (or linearized) models, enabling robust test-error bounds at minimal computational overhead (≤2× standard Bayesian regression). Unlike conventional approaches, POPS assigns uncertainty directly to the parameters, making it well suited to propagating model-form uncertainty through multi-scale simulations [3,4,5].
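
Because the uncertainty lives on the parameters rather than on individual predictions, propagation through a downstream calculation reduces to resampling. The minimal numpy sketch below illustrates this generic idea only; it is not the POPS algorithm itself (see [1,2] for that), and w_mean, w_cov, and downstream_simulation are purely illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical output of a misspecification-aware fit: a parameter mean and a
# covariance expressing model-form uncertainty (stand-ins; POPS constructs its
# parameter ensemble differently, see [1,2]).
w_mean = np.array([1.0, -0.5])
w_cov = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

def downstream_simulation(w):
    """Stand-in for a multi-scale pipeline that consumes the fitted parameters,
    e.g. a surrogate model evaluated inside a larger simulation."""
    x = np.linspace(0.0, 2.0, 50)
    return np.exp(w[0] * x + w[1] * x ** 2).mean()  # some derived scalar

# Uncertainty on parameters -> resample parameters and rerun the pipeline.
ensemble = rng.multivariate_normal(w_mean, w_cov, size=200)
outputs = np.array([downstream_simulation(w) for w in ensemble])
print(f"downstream quantity: {outputs.mean():.3f} +/- {outputs.std():.3f}")
```

In a real multi-scale workflow the resampled parameters would instead feed, for example, a surrogate model inside a larger simulation, as in [3,4,5].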

References

[1] T. Swinburne, D. Perez, Mach. Learn.: Sci. Technol., 6, 015008 (2025)
[2] https://github.com/tomswinburne/POPS-Regression.git
[3] I. Maliyov, P. Grigorev, T. Swinburne, npj Comput. Mater., 11, 22 (2025)
[4] D. Perez, A. P. A. Subramanyam, I. Maliyov, T. D. Swinburne, arXiv:2502.07104 (2025)
[5] T. D. Swinburne, C. Lapointe, M.-C. Marinica, arXiv:2502.18191 (2025)