2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization
| Format: | General Document |
|---|---|
| _version_ | 1860798156883623936 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3 |
| copyright | Copyright©PWB2025 |
| country | Malaysia |
| date | 2021-03-15 |
| format | General Document |
| id | 16216 |
| institution | UniSZA |
| originalfilename | 16216_0f8ba60541470f0.pdf |
| person | Siti Farhana Binti Husin |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16216 |
| sourcemedia | Server storage; scanned document |
| state | Terengganu |
| subject | Steepest Descent Method; Dissertations, Academic; Unconstrained Optimization; Thesis |
| summary | One of the simplest optimization methods for solving unconstrained optimization problems is the steepest descent (SD) method. The main advantage of the SD algorithm with exact line search is that it satisfies global convergence properties under suitable assumptions. The method requires only the first derivative to compute the search direction, which leads to low computational cost and storage requirements. However, it has a slow convergence rate, reflected in a high number of iterations and long central processing unit (CPU) time. Therefore, a new SD method that requires fewer iterations and less CPU time is needed. This research concerns the development of the SD method for solving large-scale nonlinear unconstrained optimization problems by suggesting new search directions for the SD algorithm. The study modifies the search direction of the SD method by adding new parameters to the classical SD direction: the first suggestion is a two-term direction, and the second is a three-term search direction with two different parameters. The proposed methods are specifically designed for solving large-scale optimization problems under exact line search procedures. A new formulation of the SD search direction combined with conjugate gradient coefficients has also been suggested. Twenty-six test problems are tested under different initial points, with dimensions ranging from two to five thousand variables. Numerical results for all of these methods are compared with existing SD methods based on the number of iterations and CPU time, with each method evaluated over the same set of test problems, and the results are interpreted using performance profiles. The applicability of the introduced methods is shown by applying the least squares method to solve selected nonlinear ordinary differential equations and by implementing data fitting through regression analysis; a data set relating fin length to total length of the silky shark species was chosen to construct a linear regression model. Theoretical proofs show that all of the proposed search directions fulfil the sufficient descent condition and the global convergence properties. Numerical results using performance profiles indicate that all of these methods outperform the Classical SD (SDC), the Zubai’ah, Mustafa, Rivaie and Ismail (ZMRI), and the Rashidah, Rivaie, Mustafa (RRM) methods, as they reduce the number of iterations and CPU time. Results also show that the new methods are applicable to real-life problems and can produce useful regression equations. In conclusion, all the proposed methods reduce the number of iterations and CPU time, and they can also be applied, through the least squares method and regression analysis, to nonlinear ordinary differential equations and real-life problems (minimal sketches of the SD iteration and the regression fit appear below this record). |
| title | 2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization |
| title_full | 2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization |
| title_fullStr | 2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization |
| title_full_unstemmed | 2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization |
| title_short | 2021_New Search Direction Formula Of Steepest Descent Method for Large Scale Unconstrained Optimization |
| title_sort | 2021_new search direction formula of steepest descent method for large scale unconstrained optimization |
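The abstract describes SD iterations of the form x_{k+1} = x_k + α_k d_k, where the classical direction is d_k = −g_k and the proposed variants add extra terms to it. The thesis's actual two- and three-term parameter formulas are not reproduced in this record, so the sketch below uses a generic placeholder term `theta * d_prev` purely for illustration. It is a minimal sketch of SD with exact line search under those assumptions, not the author's method.

```python
# Minimal sketch of steepest descent (SD) with exact line search.
# The extra term theta * d_prev is a hypothetical placeholder standing in for
# the thesis's modified two-term direction; theta = 0 recovers classical SD.
import numpy as np
from scipy.optimize import minimize_scalar

def sd_exact_line_search(f, grad, x0, theta=0.0, tol=1e-6, max_iter=10_000):
    """Run SD from x0; return the final iterate and the iteration count."""
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is small
            return x, k
        d = -g + theta * d_prev          # placeholder two-term direction
        if g @ d >= 0:                   # safeguard: keep a descent direction
            d = -g
        # Exact line search: alpha_k = argmin_{alpha >= 0} f(x_k + alpha d_k)
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 1e3), method="bounded").x
        x = x + alpha * d
        d_prev = d
    return x, max_iter

# Usage on a simple quadratic test problem (hypothetical example):
Q = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x
x_star, iters = sd_exact_line_search(f, grad, x0=[3.0, -2.0])
print(x_star, iters)
```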
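The regression application fits a linear model of the form total_length = b0 + b1 · fin_length by least squares. The silky shark data set itself is not included in this record, so the arrays below are hypothetical placeholders used only to show the computation.

```python
# Least squares fit of total_length ~ b0 + b1 * fin_length via the
# normal-equations solver in NumPy. All measurements are hypothetical.
import numpy as np

fin_length = np.array([20.0, 25.0, 30.0, 35.0, 40.0])        # hypothetical (cm)
total_length = np.array([98.0, 121.0, 150.0, 171.0, 202.0])  # hypothetical (cm)

# Design matrix with an intercept column; lstsq solves min ||A b - y||^2.
A = np.column_stack([np.ones_like(fin_length), fin_length])
(b0, b1), *_ = np.linalg.lstsq(A, total_length, rcond=None)
print(f"total_length ≈ {b0:.2f} + {b1:.2f} * fin_length")
```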