# An adaptive descent extension of the Polak–Ribière–Polyak conjugate gradient method based on the concept of maximum magnification

Document Type : Research Article

Authors

Department of Mathematics, Semnan University, P.O. Box: 35195–363, Semnan, Iran.

Abstract

Recently, a one-parameter extension of the Polak–Ribière–Polyak method has been suggested, with acceptable theoretical features and promising numerical behavior. Here, based on an eigenvalue analysis of the method, aimed at avoiding a search direction along the direction of maximum magnification by a symmetric version of the search direction matrix, an adaptive formula for computing the parameter of the method is proposed. Under standard assumptions, the given formula ensures the sufficient descent property and guarantees the global convergence of the method. Numerical experiments conducted on a collection of CUTEr test problems show the practical effectiveness of the suggested formula for the parameter of the method.
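For readers unfamiliar with the underlying iteration, the sketch below shows a basic Polak–Ribière–Polyak conjugate gradient loop with an Armijo backtracking line search and a standard restart safeguard that enforces descent. This is a generic illustration only: the adaptive one-parameter formula and the eigenvalue-based analysis proposed in the article are not reproduced here, and the safeguard tolerances are assumed values.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Basic Polak-Ribiere-Polyak conjugate gradient method.

    Generic sketch with an Armijo backtracking line search and a
    restart-to-steepest-descent safeguard; NOT the adaptive
    extension analyzed in the article.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Armijo backtracking line search (assumed constants 1e-4, 0.5)
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / gnorm2
        d_new = -g_new + beta * d
        # generic safeguard: restart when sufficient descent fails
        if g_new @ d_new > -1e-10 * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x
```

On a well-conditioned problem, e.g. a convex quadratic, the iteration drives the gradient norm below `tol` in a handful of steps; the restart branch is what crude implementations use in place of the adaptive parameter choice studied in the article.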

Keywords

#### References

1. Abubakar, A.B., Kumam, P. and Awwal, A.M. Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery, Result. Appl. Math. 4 (2019), 100069.
2. Aminifard, Z. and Babaie-Kafaki, S. An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix, 4OR, 17(3) (2019), 317–330.
3. Andrei, N. A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization, Optimization, 60(12) (2011), 1457–1471.
4. Babaie-Kafaki, S. and Ghanbari, R. A descent extension of the Polak–Ribière–Polyak conjugate gradient method, Comput. Math. Appl. 68(12) (2014), 2005–2011.
5. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, H.X. and Yuan, Y.X. Convergence properties of nonlinear conjugate gradient methods, SIAM J. Optim. 10(2) (1999), 348–358.
6. Dai, Y.H. and Liao, L.Z. New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43(1) (2001), 87–101.
7. Dolan, E.D. and Moré, J.J. Benchmarking optimization software with performance profiles, Math. Programming (Ser. A), 91(2) (2002), 201–213.
8. Gould, N.I.M., Orban, D. and Toint, Ph.L. CUTEr: A constrained and unconstrained testing environment, ACM Trans. Math. Software, 29(4) (2003), 373–394.
9. Hager, W.W. and Zhang, H. A survey of nonlinear conjugate gradient methods, Pac. J. Optim. 2(1) (2006), 35–58.
10. Hager, W.W. and Zhang, H. Algorithm 851: CG-Descent, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Software, 32(1) (2006), 113–137.
11. Heravi, A.R. and Hodtani, G.A. A new correntropy-based conjugate gradient backpropagation algorithm for improving training in neural networks, IEEE Trans. Neural Netw. Learn. Syst. 29(12) (2018), 6252–6263.
12. Lin, J. and Jiang, C. An improved conjugate gradient parametric detection based on space-time scan, Signal Process. 169 (2020), 107412.
13. Nocedal, J. and Wright, S.J. Numerical optimization, Springer, New York, 2006.
14. Sun, W. and Yuan, Y.X. Optimization theory and methods: Nonlinear programming, Springer, New York, 2006.
15. Watkins, D.S. Fundamentals of matrix computations, John Wiley and Sons, New York, 2002.
16. Yuan, G. Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optim. Lett. 3(1) (2009), 11–21.
17. Yuan, G., Lu, J. and Wang, Z. The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems, Appl. Numer. Math. 152 (2020), 1–11.
18. Zhang, L., Zhou, W. and Li, D.H. A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence, IMA J. Numer. Anal. 26(4) (2006), 629–640.