2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application
| Format: | General Document |
|---|---|
| _version_ | 1860798152159789056 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3 |
| copyright | Copyright©PWB2025 |
| country | Malaysia |
| date | 2018-02-04 |
| format | General Document |
| id | 16199 |
| institution | UniSZA |
| originalfilename | IMPROVING THE PERFORMANCE OF CONJUGATE GRADIENT METHOD IN SOLVING UNCONSTRAINEDOPTIMIZATION PROBLEMS AND ITS APPLICATION (PHD_2017).pdf |
| person | Nur Hamizah Binti Abdul Ghani |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16199 |
| sourcemedia | Server storage; Scanned document |
| state | Terengganu |
| subject | Conjugate gradient methods; Dissertations, Academic |
| summary | The conjugate gradient (CG) method is among the best iterative methods owing to its simple algorithm, low memory requirements, and well-established convergence analysis. However, a major drawback of existing CG methods is that they can be very slow on certain types of unconstrained optimization problems. There is therefore a need to improve the efficiency of these methods in terms of the number of iterations and CPU time. This research proposes two new CG methods, named NRM and NRM1 after the researchers (Nur Hamizah, Rivaie and Mustafa), which belong to the classical and hybrid classes of CG methods, respectively. Both methods are tested under exact and inexact line searches. Theoretical proofs show that both new CG methods possess the sufficient descent and global convergence properties. The efficiency of the new CG methods is studied on 30 standard unconstrained optimization test functions, giving a total of 109 problems based on four different initial points, with dimensions ranging from small to large, using MATLAB R2012. The NRM and NRM1 methods are compared with existing CG methods, namely Hestenes-Stiefel (HS), Rivaie-Mustafa-Ismail-Leong (RMIL), Touati-Ahmed-Storey (TS) and Jinbao-Han-Jiang (JHJ). Comparison based on performance profiles shows that both new CG methods are efficient within their respective classes. Under the exact line search, the NRM1 method achieves the highest success rate among the hybrid CG methods compared with TS and JHJ, while in the classical class the NRM and RMIL methods achieve the same highest success rate. Under the inexact line search, the NRM1 and NRM methods achieve the highest success rates compared with the TS, JHJ, HS and RMIL methods. Both new methods reduce the number of iterations and CPU time. Moreover, the CG methods perform better under the inexact line search than under the exact line search. The numerical results also show that the new CG methods can be applied to regression analysis problems. Thus, both new methods show promising results for implementation in further studies. |
| title | 2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application |
| title_full | 2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application |
| title_fullStr | 2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application |
| title_full_unstemmed | 2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application |
| title_short | 2017_Improving The Performance Of Conjugate Gradient Method In Solving Unconstrained Optimization Problems And Its Application |
| title_sort | 2017_improving the performance of conjugate gradient method in solving unconstrained optimization problems and its application |
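The abstract describes nonlinear conjugate gradient methods tested under exact and inexact line searches. Since the NRM and NRM1 update formulas are not reproduced in this record, the sketch below illustrates the general scheme using the classical Hestenes-Stiefel beta (one of the comparator methods named in the abstract) with an Armijo backtracking inexact line search. Function names, tolerances, and the restart safeguard are illustrative assumptions, not the thesis's method.

```python
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with Armijo backtracking (an inexact line search).

    Uses the classical Hestenes-Stiefel (HS) beta for illustration;
    the NRM/NRM1 formulas are not given in this record.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start in steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0  # HS beta
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = ||x||^2 from [3, -4]
x_star = cg_minimize(lambda x: x.dot(x), lambda x: 2.0 * x, [3.0, -4.0])
```

Swapping in a different beta formula (e.g. the RMIL or a hybrid rule) changes only the single `beta = ...` line, which is how classical and hybrid CG variants such as those studied in the thesis are typically compared.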