New stepsizes for the gradient method
Speaker: Associate Professor Sun Cong, School of Mathematical Sciences, School of Science, Beijing University of Posts and Telecommunications. Time: October 25, 2019, 15:40
Host: Associate Professor Jiang Bo
Abstract: Gradient methods are popular for solving large-scale problems. We propose a new framework that combines Cauchy steps with fixed step lengths, updating the stepsizes in a cyclic way. Four different gradient algorithms are proposed with various fixed step lengths. For 2-dimensional convex quadratic minimization problems, the algorithms either terminate in finitely many iterations or converge superlinearly; for n-dimensional problems, they all converge linearly. Moreover, based on this analysis we propose new stepsizes that find the optimal solution in 5 iterations for 3-dimensional convex quadratic minimization problems. By plugging the new stepsizes into the proposed cyclic framework, we obtain new gradient methods that guarantee finite termination for 3-dimensional problems and converge R-linearly for general n-dimensional problems. Numerical tests show the superior performance of the proposed methods over the state of the art.
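To illustrate the flavor of the cyclic framework, the sketch below alternates a Cauchy step (exact line search along the negative gradient) with a fixed step length on a convex quadratic. The specific cycle length, the fixed step value, and the function names here are illustrative assumptions, not the stepsize rules proposed in the talk.

```python
import numpy as np

def cyclic_gradient(A, b, x0, fixed_alpha, cycle_len=2, tol=1e-10, max_iter=10000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A
    by cycling between a Cauchy step and a fixed step length.
    Illustrative schedule only; the talk's framework uses its own stepsize rules."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = A @ x - b  # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        if k % cycle_len == 0:
            # Cauchy step: exact minimizer of f along the direction -g
            alpha = (g @ g) / (g @ (A @ g))
        else:
            # fixed step length; must be < 2 / lambda_max(A) for stability
            alpha = fixed_alpha
        x = x - alpha * g
    return x, k

# Example: a 2-dimensional convex quadratic with eigenvalues 1 and 10
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x, iters = cyclic_gradient(A, b, np.zeros(2), fixed_alpha=0.09)
```

The fixed step 0.09 is chosen below 2/λ_max = 0.2 so the fixed-step iterations remain stable, while the interleaved Cauchy steps guarantee monotone decrease of the objective.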