
IS004 - Advancements in Time Evolution Operators in Scientific Machine Learning
Keywords: machine learning, neural networks, scientific machine learning, time evolution
The time dimension plays a pivotal role in many physical systems that are modeled through discrete dynamical systems or evolutionary partial differential equations. Handling numerical approximation via time-advancing schemes while simultaneously discovering differential equations from time-series measurements poses significant challenges to scientific machine learning methods.
This minisymposium aims to explore operator learning approaches that treat time as an additional independent variable and specialized discrete or continuous-in-time architectures designed to model sequential data. Examples of such architectures include recurrent neural networks (RNNs) and neural ordinary differential equations (NODEs), respectively. We will also explore machine learning acceleration methods for numerical approximation, including hierarchical approaches, and other emerging methodologies that integrate data-driven approaches with classical numerical solvers.
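As a minimal illustration of the continuous-in-time architectures mentioned above, the sketch below implements a neural ordinary differential equation in plain NumPy: a small MLP plays the role of the learned right-hand side f_theta(t, y), and a forward-Euler loop integrates the state from t0 to t1. All names (`mlp_vector_field`, `node_forward`) and the Euler integrator are illustrative choices, not a prescribed method; practical NODE implementations use adaptive solvers and adjoint-based gradients.

```python
import numpy as np

def mlp_vector_field(params, t, y):
    """Small MLP f_theta(t, y) acting as the learned right-hand side."""
    W1, b1, W2, b2 = params
    inp = np.concatenate([y, [t]])  # treat time as an extra input feature
    h = np.tanh(W1 @ inp + b1)
    return W2 @ h + b2

def node_forward(params, y0, t0, t1, n_steps=100):
    """Integrate dy/dt = f_theta(t, y) from t0 to t1 with forward Euler."""
    dt = (t1 - t0) / n_steps
    t, y = t0, np.asarray(y0, dtype=float)
    for _ in range(n_steps):
        y = y + dt * mlp_vector_field(params, t, y)
        t += dt
    return y

# Randomly initialized (untrained) parameters, for shape illustration only.
rng = np.random.default_rng(0)
dim, hidden = 2, 16
params = (
    rng.normal(scale=0.1, size=(hidden, dim + 1)),  # W1 (time appended to y)
    np.zeros(hidden),                               # b1
    rng.normal(scale=0.1, size=(dim, hidden)),      # W2
    np.zeros(dim),                                  # b2
)
y1 = node_forward(params, [1.0, 0.0], 0.0, 1.0)
print(y1.shape)  # (2,)
```

In training, the parameters of the vector field would be fit so that integrated trajectories match observed time-series data, which is precisely where the interplay between the time-advancing scheme and the learning problem arises.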
The development of these methods faces several challenges. Balancing computational efficiency against numerical accuracy remains an open question, particularly when handling multiscale phenomena or stiff equations. Hyperparameter tuning in these neural network architectures often requires domain-specific knowledge and can significantly impact both training stability and generalization performance. Moreover, extrapolation beyond training regimes continues to be problematic, with many models failing to capture long-term behavior or struggling to maintain physical consistency under previously unseen conditions.
We welcome original contributions integrating numerical analysis and machine learning that present new advances in modeling and simulation of complex time-dependent systems in various applications.