2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems
| Field | Value |
|---|---|
| _version_ | 1860798156197855232 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3 |
| copyright | Copyright©PWB2025 |
| country | Malaysia |
| date | 2019-04-11 |
| format | General Document |
| id | 16213 |
| institution | UniSZA |
| originalfilename | NEW HYBRID QUASI-NEWTON METHODS FOR SOLVING UNCONSTRAINED OPTIMIZATION PROBLEMS (PHD_2019).pdf |
| person | Wan Farah Hanan Binti Wan Osman |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16213 |
| sourcemedia | Server storage; scanned document |
| spellingShingle | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| state | Terengganu |
| subject | Quasi-Newton Methods; Unconstrained Optimization; Dissertations, Academic |
| summary | The quasi-Newton (QN) method is one of the most commonly used tools for solving unconstrained optimization problems. A main advantage of the QN method is its fast superlinear convergence rate. However, the method often requires high CPU time on large-scale problems because its formula involves an approximate Hessian matrix, which carries a high memory requirement. Moreover, the convergence of the QN algorithm is only local, so the point obtained may not be the global minimum of the tested problem. In this study, the QN method is combined with the conjugate gradient (CG) method to produce new hybrid search directions. The CG method is chosen for its global convergence properties and low memory requirement, which enable it to solve large-scale problems efficiently. To overcome these problems, a new CG method for unconstrained problems, denoted WAM (Wan, Asrul and Mustafa), is proposed. The WAM method is then combined with the QN method to produce two new hybrid search directions, QN-WAM and QN-WAM+. The Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formulas are used to determine the approximation of the inverse Hessian for each of the new hybrid QN-WAM and QN-WAM+ methods. The resulting methods are referred to as DFP-WAM, BFGS-WAM, DFP-WAM+ and BFGS-WAM+. They are all tested, along with the original DFP and BFGS methods, on twenty-six standard optimization functions ranging from small to large scale. For each function, four initial points are selected, from a point near the optimal point to one located far from it. All computations are performed in Matlab. The effectiveness of the proposed search directions is analyzed using the performance profile introduced by Dolan and Moré. Based on the numerical test results, the new algorithms are more efficient than the ordinary DFP and BFGS methods in terms of number of iterations and CPU time. In general, the proposed algorithms solve the highest percentage of problems: the hybrid DFP-WAM and BFGS-WAM methods solve 96.10% and 97.27% of the tested problems respectively, while the DFP-WAM+ and BFGS-WAM+ methods solve 100%. In comparison, the DFP and BFGS methods solve only 93.36% and 93.75% of the tested problems, respectively. The new hybrid methods with the DFP and BFGS updates are also proven to satisfy the sufficient descent condition and to be globally convergent. All the proposed methods show the best efficiency in solving the selected optimization test functions compared with the other tested QN methods. |
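The abstract builds on the DFP and BFGS inverse-Hessian updates. As a minimal sketch of the standard textbook BFGS iteration it refers to (not the thesis's hybrid QN-WAM/QN-WAM+ directions, whose formulas are not reproduced in this record), the following assumes a NumPy implementation with a simple Armijo backtracking line search:

```python
# Sketch of a generic BFGS quasi-Newton iteration, assuming NumPy.
# This is the classical method the thesis modifies, not its hybrid variant.
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimize f via the BFGS inverse-Hessian update with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        # Backtracking (Armijo) line search
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS inverse-Hessian update:
            #   H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example on the Rosenbrock function, a standard unconstrained test problem,
# from the classic starting point (-1.2, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = bfgs_minimize(f, grad, np.array([-1.2, 1.0]))
```

The DFP update differs only in how `H` is revised from the same pair `(s, y)`; both belong to the Broyden family of rank-two updates.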
| title | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| title_full | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| title_fullStr | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| title_full_unstemmed | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| title_short | 2019_New Hybrid Quasi-Newton Methods For Solving Unconstrained Optimization Problems |
| title_sort | 2019_new hybrid quasi-newton methods for solving unconstrained optimization problems |
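The abstract evaluates the solvers using the performance profile of Dolan and Moré. As a sketch of how such a profile is computed from a cost table (the timing data below is hypothetical illustration, not the thesis's results):

```python
# Sketch of a Dolan–Moré performance profile, assuming NumPy.
import numpy as np

def performance_profile(T, taus):
    """T: (n_problems, n_solvers) array of costs (e.g. CPU time or iterations),
    with np.inf marking a failed solve. Returns rho[s, i] = fraction of
    problems that solver s solves within a factor taus[i] of the best solver."""
    best = T.min(axis=1, keepdims=True)      # best cost per problem
    r = T / best                             # performance ratios r_{p,s}
    return np.array([[np.mean(r[:, s] <= tau) for tau in taus]
                     for s in range(T.shape[1])])

# Hypothetical costs for 3 problems x 2 solvers; solver 0 fails on problem 3.
T = np.array([[1.0, 2.0],
              [3.0, 3.0],
              [np.inf, 4.0]])
rho = performance_profile(T, [1.0, 2.0])
```

Here `rho[s, 0]` is the fraction of problems on which solver `s` is (tied for) fastest, and the limit of `rho[s, i]` for large `taus[i]` is the fraction of problems it solves at all, which is how the abstract's solved-percentage figures are read off a profile.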