Optimizing conjugate gradient methods: A study on the parameter c in the GDSHS algorithm
Wassim Merchela¹,²,³,⁴, Noureddine Benrabia⁵,⁴, Hamza Guebbai⁴
¹ University Mustapha Stambouli Mascara, Mascara, Algeria
² University Salah Boubnider Constantine 3, Constantine, Algeria
³ Derzhavin Tambov State University, Tambov, Russia
⁴ University 8 Mai 1945 Guelma, Guelma, Algeria
⁵ University Mohamed Cherif Messaadia, Souk Ahras, Algeria
Keywords: conjugate gradient method, generalized conjugacy condition, symmetric techniques, global convergence, optimization performance
Abstract
Conjugate gradient methods represent a powerful class of optimization algorithms known for their efficiency and versatility. In this research, we study the optimization of the Generalized Descent Symmetrical Hestenes-Stiefel (GDSHS) algorithm by refining the parameter c, a critical factor in its performance. We employ both analytical and numerical methodologies to estimate the optimal range for c. Through comprehensive numerical experiments, we investigate the impact of different values of c on the algorithm's convergence behavior and computational efficiency. Comparative analyses are conducted between GDSHS variants with varying values of c and established conjugate gradient methods such as Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP+). Our findings show that setting c = 1 markedly improves the GDSHS algorithm's convergence properties and computational performance, positioning it as a competitive choice among state-of-the-art optimization techniques.
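The exact GDSHS direction update is defined in the body of the paper; to illustrate where a scalar parameter such as c enters a conjugate gradient iteration, the following minimal Python sketch implements a generic nonlinear CG loop with a Hestenes-Stiefel-type coefficient scaled by c. The function name cg_hs_general, the backtracking line search, and the role assigned to c here are assumptions made for illustration, not the authors' GDSHS scheme.

    import numpy as np

    def cg_hs_general(f, grad, x0, c=1.0, tol=1e-6, max_iter=5000):
        # Illustrative sketch: generic nonlinear CG with an HS-type
        # coefficient scaled by c (assumed role of c; not the exact GDSHS update).
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Backtracking (Armijo) line search along the descent direction d.
            alpha, rho, mu = 1.0, 0.5, 1e-4
            while f(x + alpha * d) > f(x) + mu * alpha * g.dot(d) and alpha > 1e-16:
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g                          # gradient difference y_k
            denom = d.dot(y)
            # Hestenes-Stiefel-type coefficient scaled by c.
            beta = c * g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
            d = -g_new + beta * d
            if g_new.dot(d) >= 0.0:                # safeguard: restart if not a descent direction
                d = -g_new
            x, g = x_new, g_new
        return x

    # Example: minimize the Rosenbrock function from the standard starting point.
    f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])
    print(cg_hs_general(f, grad, [-1.2, 1.0], c=1.0))  # should approach the minimizer (1, 1)

In this generic setting, c = 1 recovers the plain Hestenes-Stiefel coefficient, which is consistent with the abstract's finding that c = 1 is the preferred choice; other values rescale the contribution of the previous direction d.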