2015_New Family of Conjugate Gradient Methods with Sufficient Descent Condition and Global Convergence For Unconstrained Optimizations

Bibliographic Details
Format: General Document
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3
copyright Copyright©PWB2025
country Malaysia
date 2015-09-08
format General Document
id 16210
institution UniSZA
originalfilename NEW FAMILY OF CONJUGATE GRADIENT METHODS WITH SUFFICIENT DESCENT CONDITION AND GLOBAL CONVERGENCE FOR UNCONSTRAINED OPTIMIZATIONS (PHD_2015).pdf
person Ibrahim Bin Jusoh
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16210
sourcemedia Server storage; Scanned document
state Terengganu
subject Conjugate gradient methods; Sufficient Descent Condition; Unconstrained Optimization; Dissertations, Academic
summary Conjugate gradient methods are a significant family of methods for solving large-scale unconstrained optimization problems, owing to the simplicity of their algorithms and their low memory requirements. Much effort has been devoted to improving these methods since 1964, when the work of Fletcher and Reeves opened the way to nonlinear conjugate gradient methods. In this research, two new simple modifications of the conjugate gradient coefficient are proposed. Both algorithms satisfy the sufficient descent condition and are globally convergent under both exact line search and the strong Wolfe line search. The convergence rate is superlinear, and the search directions satisfy the angle condition. Since a proof of global convergence does not by itself guarantee that an algorithm is efficient, the new methods are tested on twenty-eight standard optimization test problems using MATLAB version 7.10.0 (R2010a) subroutine programming and compared with five well-known conjugate gradient methods: Fletcher-Reeves (FR), Polak-Ribiere-Polyak (PRP), Hestenes-Stiefel (HS), Wei-Yao-Liu (WYL), and Dai-Yuan (DY). Numerical results based on the number of iterations and CPU time are analyzed and presented using the performance profiles of Dolan and Moré. For every test function, four initial points are selected, some close to the solution and some further away. Both new formulas are found to outperform the other formulas under exact line search, while IMR1 outperforms the other formulas under the strong Wolfe line search.
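
For background, the generic nonlinear conjugate gradient iteration and the five classical coefficients named in the abstract take the following standard forms; this is textbook notation only, and the thesis's two new coefficients (such as IMR1) are not reproduced here because the abstract does not state them.

    x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,

where g_k = \nabla f(x_k) and, writing y_k = g_{k+1} - g_k,

    \beta_k^{FR}  = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
    \beta_k^{PRP} = \frac{g_{k+1}^{T} y_k}{\|g_k\|^2}, \qquad
    \beta_k^{HS}  = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},

    \beta_k^{DY}  = \frac{\|g_{k+1}\|^2}{d_k^{T} y_k}, \qquad
    \beta_k^{WYL} = \frac{g_{k+1}^{T}\bigl(g_{k+1} - \tfrac{\|g_{k+1}\|}{\|g_k\|}\, g_k\bigr)}{\|g_k\|^2}.

The sufficient descent condition requires, for some constant c > 0,

    g_k^{T} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k,

and the strong Wolfe line search chooses \alpha_k so that, for constants 0 < \delta < \sigma < 1,

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k, \qquad
    \lvert g(x_k + \alpha_k d_k)^{T} d_k \rvert \le \sigma\, \lvert g_k^{T} d_k \rvert.

Below is a minimal MATLAB sketch of this iteration, using the Fletcher-Reeves coefficient and a simple backtracking (Armijo) line search standing in for the strong Wolfe search actually used in the thesis; the test function, starting point, iteration limit, and tolerances are illustrative assumptions, not the thesis's experimental setup.

    function cg_demo
        % Minimal nonlinear conjugate gradient sketch (Fletcher-Reeves beta).
        x = [-1.2; 1];                      % classical Rosenbrock starting point
        [f, g] = rosenbrock(x);
        d = -g;                             % initial steepest-descent direction
        for k = 1:2000
            if norm(g) < 1e-6, break; end   % gradient-norm stopping test
            % Backtracking line search for the Armijo sufficient-decrease condition;
            % d is guaranteed to be a descent direction, so this loop terminates.
            alpha = 1; delta = 1e-4;
            while rosenbrock(x + alpha*d) > f + delta*alpha*(g'*d)
                alpha = alpha/2;
            end
            x = x + alpha*d;
            [fnew, gnew] = rosenbrock(x);
            beta = (gnew'*gnew)/(g'*g);     % Fletcher-Reeves coefficient
            d = -gnew + beta*d;             % new conjugate direction
            if gnew'*d >= 0, d = -gnew; end % restart if descent is lost
            f = fnew; g = gnew;
        end
        fprintf('iterations = %d, f = %.3e\n', k, f);
    end

    function [f, g] = rosenbrock(x)
        % Two-dimensional Rosenbrock function and its gradient
        f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
              200*(x(2) - x(1)^2)];
    end

The restart safeguard (resetting d to the steepest-descent direction whenever g'*d >= 0) is a common practical device for enforcing the descent property under inexact line searches; the thesis's proposed coefficients satisfy the sufficient descent condition by construction, so they would not need it.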
title 2015_New Family of Conjugate Gradient Methods with Sufficient Descent Condition and Global Convergence For Unconstrained Optimizations