2019_New Conjugate Gradient Methods Using Strong Wolfe Line Search For Estimating Dividend Rate

Bibliographic Details
Format: General Document
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection3
copyright Copyright©PWB2025
country Malaysia
date 2019-08-16
format General Document
id 16209
institution UniSZA
originalfilename NEW CONJUGATE GRADIENT METHODS USING STRONG WOLFE LINE SEARCH FOR ESTIMATING DIVIDEND RATE (PHD_2019).pdf
person Norhaslinda Binti Zull Pakkal
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=16209
sourcemedia Server storage; Scanned document
faculty Faculty of Informatics & Computing
language English
mimetype application/pdf
publisher Universiti Sultan Zainal Abidin
access Private Access
keywords Conjugate gradient methods; Strong Wolfe Line Search; Optimization Algorithms
type Thesis
state Terengganu
subject Conjugate gradient methods
Dissertations, Academic
summary The conjugate gradient (CG) method is a popular method for solving unconstrained optimization problems, and ongoing studies of the CG method have steadily improved its efficiency. Its advantages include a simple algorithm, low memory requirements and global convergence properties, which make it efficient for solving large-scale unconstrained optimization problems. However, some existing CG methods require a high number of iterations and much Central Processing Unit (CPU) time, and some are not applicable to real-life problems. To overcome these problems, new CG methods with fewer iterations and less CPU time for data fitting are introduced. In this research, two new CG methods, LAMR and LAMR+, are introduced; the name stands for Linda – ‘Aini – Mustafa – Rivaie. The LAMR method is motivated by the Rivaie – Mustafa – Ismail – Leong (RMIL) method and retains its restart properties. The LAMR+ method improves on LAMR by eliminating its negative values. Proofs of the convergence analysis for LAMR and LAMR+ under the inexact strong Wolfe line search are given. The new methods are then tested against four well-known CG coefficients: RMIL, Wei – Yao – Liu (WYL), Hestenes – Stiefel (HS) and Conjugate Descent (CD). All of the methods were tested with numbers of variables from 2 to 10,000 on twenty-eight standard unconstrained optimization functions using MATLAB R2012 subroutine programming. Four different initial points were selected for each test function, ranging from points close to the minimum to points far from it, in order to validate the efficiency and robustness of the introduced methods. The number of iterations and the CPU time of the tested methods were recorded and analyzed using performance profiles.
Then, the applicability of the introduced methods is demonstrated by applying them to data fitting through regression analysis. A real data set of Employees’ Provident Fund (EPF) dividend rates was chosen to construct linear and quadratic regression models. Theoretically, the introduced methods possess the sufficient descent and global convergence properties. In the numerical analysis, the LAMR+ method solved all of the test functions, followed by the LAMR, WYL, HS, RMIL and CD methods, which solved 98.8%, 95.2%, 85.6%, 77.2% and 75.0% of the functions respectively. Both LAMR and LAMR+ were shown to be applicable to a real-life problem. On the tested functions, the new methods show better results than the existing methods and can therefore be considered competitive alternatives to other CG methods.
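The abstract describes the general recipe: a nonlinear CG iteration whose step length satisfies the strong Wolfe conditions — sufficient decrease, f(x+αd) ≤ f(x) + c₁α∇f(x)ᵀd, and strong curvature, |∇f(x+αd)ᵀd| ≤ c₂|∇f(x)ᵀd| — applied to data fitting by minimizing a sum of squared residuals. The sketch below illustrates this pipeline under stated assumptions: the LAMR/LAMR+ coefficient formulas are not given in this record, so the classical Hestenes–Stiefel coefficient (one of the comparison methods named above) is used as a stand-in, the line search is a simple bisection-style implementation, and the fitted data set is synthetic, not the EPF dividend series.

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Bisection-style line search for a step satisfying the strong Wolfe
    conditions (sufficient decrease + strong curvature)."""
    f0, g0 = f(x), grad(x) @ d          # g0 < 0 for a descent direction
    lo, hi, alpha = 0.0, None, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0:
            hi = alpha                   # sufficient decrease fails: step too long
        else:
            ga = grad(x + alpha * d) @ d
            if abs(ga) <= c2 * abs(g0):  # strong curvature condition holds
                return alpha
            if ga < 0:
                lo = alpha               # slope still negative: step too short
            else:
                hi = alpha
        alpha = 0.5 * (lo + hi) if hi is not None else 2.0 * alpha
    return alpha

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hestenes-Stiefel coefficient (a stand-in for
    the thesis's LAMR/LAMR+ formulas, which this record does not state)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:               # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Data-fitting demo: fit y ~ w0 + w1*t + w2*t^2 by minimizing the sum of
# squared residuals (synthetic data standing in for a real series).
t = np.linspace(0.0, 4.0, 9)
y = 1.0 + 0.5 * t - 0.25 * t**2
A = np.vstack([np.ones_like(t), t, t**2]).T

f = lambda w: 0.5 * np.sum((A @ w - y) ** 2)
grad = lambda w: A.T @ (A @ w - y)

w = cg_minimize(f, grad, np.zeros(3))
```

Because the synthetic data lie exactly on a quadratic, the recovered coefficients `w` approach (1.0, 0.5, −0.25); with noisy real data such as a dividend-rate series, the same routine returns the least-squares regression coefficients instead.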
title 2019_New Conjugate Gradient Methods Using Strong Wolfe Line Search For Estimating Dividend Rate