2017_New Conjugate Gradient and Its Hybrid Method for Unconstrained Optimization Problems
| Field | Value |
|---|---|
| _version_ | 1860798154476093440 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3 |
| copyright | Copyright©PWB2025 |
| country | Malaysia |
| date | 2018-02-04 |
| format | General Document |
| id | 16207 |
| institution | UniSZA |
| originalfilename | 16207_0b7bec0804d310e.pdf |
| person | Nurul 'Aini Binti Harun |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16207 |
| sourcemedia | Server storage; scanned document |
| state | Terengganu |
| subject | Conjugate gradient methods; Dissertations, Academic |
| summary | There are several methods for solving unconstrained optimization problems. Among the most commonly used are the conjugate gradient (CG) and quasi-Newton (QN) methods. The CG approach is suitable for large-scale problems because of its low memory requirement. However, most developments of the CG method involve complex equations for the search direction, making them difficult to implement, and some are not globally convergent. The QN method is efficient for small- and medium-scale problems but becomes increasingly slow on large-scale problems, because the n × n matrix in its search-direction formula leads to a high memory requirement. In this study, a globally convergent CG coefficient is presented. The new algorithm is based on the classical CG method, which is simple yet efficient in solving unconstrained optimization problems. Next, to improve the efficiency of the QN method, a hybrid QN–CG search direction is proposed; the CG method is chosen for its computational capability on large-scale problems and its good convergence properties. The CG coefficient proposed in this thesis is used in the hybrid search direction. The new algorithms are shown to possess the sufficient descent and global convergence properties when used with the strong Wolfe line search. For the numerical tests, twenty-five standard functions are used with varying dimensions and starting points, and solver performance is measured in terms of iteration count and CPU time. All computations are performed in MATLAB R2012. The performance of the new CG method is compared with several current CG methods, while the new hybrid QN method is compared with the original QN method and two existing hybrid QN methods. An application in data fitting is also included to demonstrate the applicability of the new approaches to a real-life problem.
Based on the numerical results, the proposed CG method solves the highest number of test problems and requires the fewest iterations and the least CPU time among the CG methods tested. The new hybrid QN method shows similar results when compared with the other QN-based solvers. The proposed algorithms also prove applicable to data fitting. The new CG and hybrid QN methods show great efficiency in solving both the unconstrained optimization test problems and the real-life problem. Moreover, both approaches possess the sufficient descent and global convergence properties, as demonstrated by the theoretical analysis and numerical results. |
| title | 2017_New Conjugate Gradient and Its Hybrid Method for Unconstrained Optimization Problems |
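The abstract above describes nonlinear CG methods: a search direction built from the current gradient and a CG coefficient, combined with a line search. As a minimal illustration of that structure, the sketch below uses the classical Fletcher–Reeves coefficient and a simple backtracking Armijo line search. Note the assumptions: the thesis proposes its own CG coefficient and uses the strong Wolfe line search, neither of which is reproduced here; the function name `cg_minimize` and all parameter values are illustrative.

```python
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG sketch (Fletcher-Reeves coefficient shown as a
    stand-in; the thesis proposes a different coefficient and uses a strong
    Wolfe line search rather than the Armijo backtracking used here)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying the Armijo condition
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # CG search-direction update
        if d.dot(g_new) >= 0:               # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: a convex quadratic f(x) = 0.5 * x^T A x with A = diag(1, 10)
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x.dot(A).dot(x)
grad = lambda x: A.dot(x)
x_star = cg_minimize(f, grad, np.array([3.0, -2.0]))
```

Unlike a QN method, no n × n matrix is stored: each iteration keeps only the current gradient and direction vectors, which is the low-memory property the abstract attributes to CG methods.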