Implicit Neural Field Reconstruction on Complex Shapes from Scattered Data
In many engineering and medical applications, reconstructing physical fields and domain geometries from noisy, scattered data collected by local sensors is a critical task. Both the statistical reconstruction of distributed quantities and the simulation of physical processes (typically modeled by partial differential equations) depend heavily on accurate geometry reconstruction. Simulation approaches usually combine intensive pre-processing of geometrical data to obtain mesh representations with computationally intensive numerical solvers that approximate the solution field. In this talk, we introduce two novel geometry reconstruction approaches that require only point cloud representations and are suitable for high- and low-density data scenarios, and we demonstrate their application to the reconstruction of cardiac geometries and to surrogate modelling. In the low-density, noisy, single-subject scenario, we develop a method that exploits only surface-level point measurements. We design a custom loss function that combines fitting and regularization terms to improve model generalization. In the high-density data scenario, we propose a supervised reconstruction pipeline based on the DeepSDF architecture, which integrates an embedding model and a regression network to learn the shapes of multiple objects simultaneously. Each geometry is represented by a latent code that encodes its shape information, allowing realistic synthetic geometries to be generated by sampling the latent space. We test the validity of this method for inferring anisotropic passive mechanics on patient-specific and synthetically generated geometries using a second neural network. This highlights the potential of shape reconstruction approaches for developing new shape-informed, neural-network-based surrogate modelling techniques.
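To make the two key ingredients concrete, the sketch below illustrates, in the spirit of the DeepSDF-style pipeline described above, an auto-decoder that conditions a small network on a per-shape latent code and trains it with a loss combining a fitting term and a regularization term. This is a minimal illustrative sketch, not the authors' implementation: the network size, the clamped-L1 fitting term, the latent regularization weight, and the unit-sphere supervision data are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen for illustration only.
LATENT_DIM, HIDDEN, N_SHAPES, N_POINTS = 8, 32, 4, 64

# One learnable latent code per shape (auto-decoder style): the code is
# optimized jointly with the network weights and encodes shape identity.
latents = 0.01 * rng.standard_normal((N_SHAPES, LATENT_DIM))

# A tiny two-layer MLP; its input is a 3D point concatenated with a latent code.
W1 = rng.standard_normal((3 + LATENT_DIM, HIDDEN)) / np.sqrt(3 + LATENT_DIM)
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, 1)) / np.sqrt(HIDDEN)
b2 = np.zeros(1)

def predict_sdf(points, z):
    """Predict signed distances for `points` conditioned on latent code `z`."""
    x = np.hstack([points, np.tile(z, (len(points), 1))])
    h = np.tanh(x @ W1 + b1)
    return (h @ W2 + b2).ravel()

def loss(points, sdf_true, z, reg_weight=1e-2):
    """Fitting term (clamped L1) plus an L2 regularizer on the latent code.

    The clamp focuses the fit near the surface; the regularizer keeps the
    latent space compact so that sampling it yields plausible shapes.
    """
    pred = predict_sdf(points, z)
    fit = np.mean(np.abs(np.clip(pred, -0.1, 0.1) - np.clip(sdf_true, -0.1, 0.1)))
    return fit + reg_weight * np.sum(z ** 2)

# Synthetic supervision for the sketch: signed distance to the unit sphere.
pts = rng.uniform(-1.5, 1.5, size=(N_POINTS, 3))
sdf = np.linalg.norm(pts, axis=1) - 1.0

print(loss(pts, sdf, latents[0]))
```

In a full pipeline, the latent codes and network weights would be optimized by gradient descent over many shapes, after which new geometries can be generated by decoding samples drawn from the latent space.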
