Tutorial on rigorous sensor fusion

© 2022 EPFL

Two researchers from EPFL-TOPO, together with Dr. Ismael Colomina from Geonumerics, gave a tutorial on Dynamic Networks on June 5, 2022, at the ISPRS Congress in Nice.

Dynamic Networks (DN) offer a rigorous, one-step solution for the registration of optical sensors with the support of navigation data. This approach has multiple advantages over the current multi-stage fusion methodology, owing to its ability to mitigate the effect of observation errors directly at the sensor level. In turn, this improves the quality and fidelity of the resulting geospatial models, such as elevation models, orthophotos, 3D point clouds, or digital twins in general.

The tutorial was organized by the ISPRS WG I/9 Integrated Sensor Orientation, Calibration, Navigation and Mapping, chaired by MER Jan Skaloud.

The online implementation of the Dynamic Network (https://odyn.epfl.ch) is supported by ENAC-IT4research.

References

[1] Cucci, D.A., "ODyN: an online dynamic network solver for photogrammetry and LiDAR geo-referencing". ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, V-1-2022 (2022): 153–159.

For more information on the scientific background of Dynamic Networks and their applications to navigation, mapping, and sensor orientation in general, please refer to the following scientific publications:

[2] Cucci, D.A., Rehak, M., and Skaloud, J., "Bundle adjustment with raw inertial observations in UAV applications". ISPRS Journal of Photogrammetry and Remote Sensing, 130 (2017): 1–12.

[3] Brun, A., Cucci, D.A., and Skaloud, J., "LiDAR point-to-point correspondences for rigorous registration of kinematic scanning in dynamic networks". ISPRS Journal of Photogrammetry and Remote Sensing, 189 (2022): 185–200.

[4] Mouzakidou, K., Cucci, D.A., and Skaloud, J., "On the benefit of concurrent adjustment of active and passive optical sensors with GNSS & raw inertial data". ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, V-1-2022 (2022): 161–168.