New stepsizes for the gradient method

Title: New stepsizes for the gradient method
Publication Type: Journal Article
Year of Publication: 2019
Authors: Sun, C, Liu, J-P
Journal: Optim Lett
Date Published: 28 January 2019
Abstract

Gradient methods are valued for their simplicity and low per-iteration cost, which makes them increasingly attractive for large-scale optimization problems. A good stepsize plays an important role in constructing an efficient gradient method. This paper proposes a new framework for generating stepsizes for gradient methods applied to convex quadratic minimization problems. By adopting different criteria, we propose four new gradient methods. For 2-dimensional unconstrained problems with convex quadratic objective functions, we prove that the new methods either terminate in finitely many iterations or converge R-superlinearly; for n-dimensional problems, we prove that all the new methods converge R-linearly. Numerical experiments show that the new methods have lower complexity and outperform existing gradient methods.
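The abstract does not give the paper's new stepsize formulas, so as background only, here is a minimal sketch of the problem setting: a gradient method minimizing a convex quadratic f(x) = ½xᵀAx − bᵀx with the classical Cauchy (exact line search) stepsize. All names and parameters below are illustrative assumptions, not the authors' methods.

```python
import numpy as np

def gradient_method(A, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent on f(x) = 0.5 x^T A x - b^T x with A symmetric
    positive definite, using the classical exact (Cauchy) stepsize.
    This is generic background, not the stepsizes proposed in the paper."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:        # stop when gradient is small
            break
        alpha = (g @ g) / (g @ (A @ g))    # exact minimizer along -g
        x = x - alpha * g
    return x, k

# 2-dimensional example, matching the dimension discussed in the abstract
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])                 # SPD Hessian
b = np.array([1.0, 2.0])
x_star, iters = gradient_method(A, b, np.zeros(2))
# the minimizer satisfies A x = b
```

For a well-conditioned quadratic like this, the exact-stepsize method converges R-linearly at a rate governed by the condition number of A, which is the baseline behavior that improved stepsize rules aim to beat.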

DOI: 10.1007/s11590-019-01512-y