The conjugate gradient (CG) method is one of the most important ideas in scientific computing; it is applied both to solving linear systems of equations and to nonlinear optimization problems. In this paper, based on a variant of the Hestenes-Stiefel (HS) method and the Polak-Ribiere-Polyak (PRP) method, two modified CG methods (named MHS∗ and MPRP∗) are presented and analyzed. The search direction of the presented methods satisfies the sufficient descent condition at each iteration. We establish the global convergence of the proposed algorithms under standard assumptions and the strong Wolfe line search. Preliminary numerical results are presented, demonstrating the promise and effectiveness of the proposed methods. Finally, the proposed methods are further extended to the problem of the conditional model regression function.
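To illustrate the general structure of a nonlinear CG iteration of the kind discussed above, the following Python sketch uses the classical PRP formula for the update parameter together with a Wolfe line search. It is an assumption-laden illustration only: the paper's MHS∗ and MPRP∗ methods use modified formulas that are not reproduced here, and the function and helper names are hypothetical.

```python
import numpy as np
from scipy.optimize import line_search

def cg_prp(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with the classical PRP beta and a Wolfe line search.

    Illustrative sketch only; the paper's MHS* and MPRP* methods use
    modified beta formulas not shown here.
    """
    x = x0.copy()
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Line search along d; scipy's line_search enforces the Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:           # line search failed; fall back to a small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP beta with nonnegativity restart (not the paper's formula)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage: minimize the Rosenbrock function
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(cg_prp(rosen, rosen_der, np.array([-1.2, 1.0])))
```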
Chaib, Y., & Abd elhamid, M. (2024). Global convergence of new conjugate gradient methods with application in conditional model regression function. Iranian Journal of Numerical Analysis and Optimization. doi: 10.22067/ijnao.2024.85389.1344