New acceleration strategies for gradient-based methods
Gradient-based methods are widely used for solving large-scale nonlinear optimization problems, owing to their computational simplicity, low cost per iteration, and modest memory requirements. In recent years, very efficient gradient-based approaches for constrained and unconstrained optimization have been designed that exploit special strategies for steplength selection. In this talk, a new acceleration strategy will be presented, based on the idea of performing, at selected iterations, minimization steps along descent directions other than the negative gradient, or even over low-dimensional affine subspaces. Indeed, composite search directions have been used in different contexts in the past, and some classical methods, such as Polyak's heavy-ball method and the conjugate gradient method, can be framed as gradient-based minimization methods equipped with composite search directions. The proposed approach will first be analysed in the strictly convex quadratic framework, where the reductions of the linear and quadratic parts of the objective function are inspected as a guiding principle for designing the line searches performed in the acceleration steps. It will then be shown how the strategy can be incorporated into a gradient projection algorithm for convex quadratic programming subject to bound constraints. Finally, alternative ideas for extending the approach to general nonquadratic unconstrained problems will be discussed, together with numerical tests of the different proposed strategies on synthetic and real-world problems.
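To make the quadratic setting concrete, the following is a minimal illustrative sketch (not the authors' algorithm) of an acceleration step of the kind described above: for a strictly convex quadratic f(x) = (1/2) x'Ax - b'x, the objective is minimized exactly over a two-dimensional affine subspace spanned by the negative gradient and the previous step, by solving a small reduced linear system. All names and the choice of subspace are assumptions made for illustration.

```python
import numpy as np

def subspace_accel_step(A, b, x, d_prev):
    """Hypothetical acceleration step for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite: exactly minimize f over the
    affine subspace x + span{-g, d_prev}, where g is the current
    gradient and d_prev the previous step. Illustrative sketch only."""
    g = A @ x - b                       # gradient of the quadratic
    D = np.column_stack([-g, d_prev])   # composite search directions
    # Reduced 2x2 problem: min_alpha f(x + D alpha)
    # first-order condition: (D^T A D) alpha = -D^T g
    alpha = np.linalg.solve(D.T @ A @ D, -D.T @ g)
    step = D @ alpha
    return x + step, step

# Toy usage on a random strictly convex quadratic
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M.T @ M + 50 * np.eye(50)          # SPD Hessian
b = rng.standard_normal(50)
x = np.zeros(50)
d = rng.standard_normal(50) * 1e-3     # seed "previous step" for the subspace
for _ in range(20):
    x, d = subspace_accel_step(A, b, x, d)
print(np.linalg.norm(A @ x - b))       # residual should be near zero
```

With this particular subspace choice the iteration reproduces conjugate-gradient-like behaviour, which illustrates the remark above that classical methods can be recovered as special cases of composite-direction schemes.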
