Implementation of Symmetric Rank-One Methods for Unconstrained Optimization

The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or...

Full description

Bibliographic Details
Main Author: Khiyaban, Farzin Modarres
Format: Thesis
Language: English
Published: 2010
Subjects:
Online Access:http://psasir.upm.edu.my/id/eprint/19590/
http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf
_version_ 1848843767236263936
author Khiyaban, Farzin Modarres
author_facet Khiyaban, Farzin Modarres
author_sort Khiyaban, Farzin Modarres
building UPM Institutional Repository
collection Online Access
description The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining its basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method of this class. However, SR1 can be awkward to use, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function value information in order to achieve higher-order accuracy in approximating the curvature of the objective function. We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence.
The rate of convergence of such algorithms is also discussed. Owing to the deficiencies of SR1, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared to several other well-known minimization methods. Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.
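The SR1 update and the zero-denominator deficiency mentioned in the abstract can be illustrated with a minimal sketch. This is not the thesis's own algorithm: the update formula B⁺ = B + vvᵀ/(vᵀs) with v = y − Bs is the classical SR1 rule, and the skipping threshold `r` is a standard textbook safeguard, not a parameter from this work.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """One classical symmetric rank-one (SR1) update of a Hessian approximation.

    B : current n x n symmetric Hessian approximation
    s : step, x_{k+1} - x_k
    y : gradient difference, g_{k+1} - g_k
    r : skipping threshold (a common heuristic, assumed here)
    """
    v = y - B @ s
    denom = v @ s  # v^T s; may vanish -- the deficiency the abstract describes
    # Skip the update when the denominator is too small relative to |s||v|,
    # since the rank-one correction would be undefined or numerically unstable.
    if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
        return B
    return B + np.outer(v, v) / denom
```

On a quadratic objective f(x) = ½xᵀAx, where y = As exactly, SR1 enjoys a well-known finite-termination property: with n linearly independent steps (and nonzero denominators), the approximation recovers A after at most n updates.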
first_indexed 2025-11-15T08:20:15Z
format Thesis
id upm-19590
institution Universiti Putra Malaysia
institution_category Local University
language English
last_indexed 2025-11-15T08:20:15Z
publishDate 2010
recordtype eprints
repository_type Digital Repository
spelling upm-195902013-05-27T08:02:40Z http://psasir.upm.edu.my/id/eprint/19590/ Implementation of Symmetric Rank-One Methods for Unconstrained Optimization Khiyaban, Farzin Modarres The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ Rⁿ. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining its basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method of this class. However, SR1 can be awkward to use, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because its denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of this update. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function value information in order to achieve higher-order accuracy in approximating the curvature of the objective function.
We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence. The rate of convergence of such algorithms is also discussed. Owing to the deficiencies of SR1, we introduce a restarting procedure that uses the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods, along with a structured method that uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared to several other well-known minimization methods. Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis. 2010-12 Thesis NonPeerReviewed application/pdf en http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf Khiyaban, Farzin Modarres (2010) Implementation of Symmetric Rank-One Methods for Unconstrained Optimization. PhD thesis, Universiti Putra Malaysia. Newton-Raphson method. English
spellingShingle Newton-Raphson method.
Khiyaban, Farzin Modarres
Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_fullStr Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_full_unstemmed Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_short Implementation of Symmetric Rank-One Methods for Unconstrained Optimization
title_sort implementation of symmetric rank-one methods for unconstrained optimization
topic Newton-Raphson method.
url http://psasir.upm.edu.my/id/eprint/19590/
http://psasir.upm.edu.my/id/eprint/19590/1/FS_2010_41_F.pdf