
IS010 - From Classical to Data-Driven: Modern Trends in Optimization and Regularization (Part 1)
Keywords: machine learning, optimization, regularization
The increasing complexity of modern scientific and engineering problems calls for scalable and efficient numerical optimization techniques. As problem sizes grow, computational requirements become more demanding, and classical optimization techniques must be adapted to such high-dimensional settings, e.g. by combining them with inexact or stochastic strategies [1]. At the same time, addressing these problems requires regularization strategies to deal with ill-posedness and uncertainty [2]. In recent years, hybrid approaches that blend optimization and regularization with learning-based techniques have been successfully proposed, including data-driven optimization solvers, neural representations of classical variational models, and the incorporation of learned priors into physical models. The integration of deep learning is expected not only to accelerate optimization procedures but also to improve the accuracy of the approximate solutions by exploiting the data at hand. Current research on these methods focuses on improving their theoretical analysis to better understand their underlying principles, as well as on enhancing their effectiveness in practical applications.
This minisymposium brings together researchers to present and discuss recent contributions at the intersection of numerical optimization, regularization, and data-driven methodologies. Innovative approaches will be discussed from theoretical and/or computational points of view, shedding light on possible future directions in these research fields.
[1] L. Bottou, F. E. Curtis, and J. Nocedal: Optimization Methods for Large-Scale Machine Learning, SIAM Review 60(2) (2018): 223-311.
[2] M. Benning and M. Burger: Modern Regularization Methods for Inverse Problems, Acta Numerica 27 (2018): 1-111.