TY - JOUR
T1 - Limited memory gradient methods for unconstrained optimization
AU - Ferrandi, Giulia
AU - Hochstenbach, Michiel E.
PY - 2024/7/26
Y1 - 2024/7/26
N2 - The limited memory steepest descent method (LMSD, Fletcher, 2012) for unconstrained optimization problems stores a few past gradients to compute multiple stepsizes at once. We review this method and propose new variants. For strictly convex quadratic objective functions, we study the numerical behavior of different techniques to compute new stepsizes. In particular, we introduce a method to improve the use of harmonic Ritz values. We also show the existence of a secant condition associated with LMSD, where the approximating Hessian is projected onto a low-dimensional space. In the general nonlinear case, we propose two new alternatives to Fletcher’s method: first, the addition of symmetry constraints to the secant condition valid for the quadratic case; second, a perturbation of the last differences between consecutive gradients, to satisfy multiple secant equations simultaneously. We show that Fletcher’s method can also be interpreted from this viewpoint.
KW - 65F10
KW - 65F15
KW - 65K05
KW - 90C20
KW - 90C30
KW - Limited memory steepest descent
KW - Low-dimensional Hessian approximation
KW - Lyapunov equation
KW - Rayleigh–Ritz extraction
KW - Secant condition
KW - Unconstrained optimization
UR - http://www.scopus.com/inward/record.url?scp=85199749874&partnerID=8YFLogxK
U2 - 10.1007/s11075-024-01895-9
DO - 10.1007/s11075-024-01895-9
M3 - Article
AN - SCOPUS:85199749874
SN - 1017-1398
VL - XX
JO - Numerical Algorithms
JF - Numerical Algorithms
IS - X
ER -