Multiple Alternate Steps Gradient Methods For Unconstrained Optimization

The focus of this thesis is on finding the unconstrained minimizer of a function using alternate-steps gradient methods. Specifically, we focus on two well-known classes of gradient methods: the steepest descent (SD) method and the Barzilai-Borwein (BB) method. We first give a brief mathematical background on unconstrained optimization and on gradient methods. We then discuss the SD and BB methods, the fundamental gradient methods whose steps are used alternately to solve optimization problems. General and local convergence analyses of the SD and BB methods are given, together with the related line search methods. A review of the alternate step (AS) gradient method, with brief numerical results and convergence analyses, is also presented. The main practical deficiency of the SD method is that the search directions it generates tend to alternate between two directions, which causes the method to perform poorly and require more computational work. Although the BB method does not guarantee a descent in the objective function at each iteration, owing to its nonmonotone behavior, it performs better than the SD method in this setting. Motivated by these limitations, we introduce a new gradient method that improves on the SD and BB methods, namely the Multiple Alternate Steps (MAS) gradient method. The convergence of the MAS method is investigated, and its behavior is analyzed. Furthermore, we present numerical results on quadratic test problems to compare the performance of the MAS method with the SD, BB and AS methods. The purpose of this research is to develop a working knowledge of optimization theory and methods. We hope that the new MAS gradient method can make a significant contribution to everyday applications, for example maximizing the profit of a manufacturing operation or improving a system to reduce its effective runtime. Finally, we comment on the achievements of this research, and possible extensions are given to conclude the thesis.
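For reference, the step-size rules the abstract refers to are standard. On a convex quadratic f(x) = (1/2) x'Ax - b'x with gradient g(x) = Ax - b, the exact-line-search SD step is alpha = g'g / g'Ag, and the BB1 step is alpha = s's / s'y with s = x_k - x_{k-1} and y = g_k - g_{k-1}. The following minimal sketch (not the thesis code) implements both rules; the "ALT" rule is one simple SD/BB alternation included only to illustrate the alternate-steps idea, not the AS or MAS scheme defined in the thesis.

# Minimal sketch, not the thesis code: SD and BB step sizes on a convex
# quadratic f(x) = 0.5*x'Ax - b'x, whose gradient is g(x) = Ax - b.
# "ALT" alternates SD and BB steps by iteration parity for illustration
# only; it is not the AS or MAS scheme defined in the thesis.
import numpy as np

def gradient_method(A, b, x0, rule="SD", tol=1e-8, max_iter=10000):
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b
    s_prev = y_prev = None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        use_bb = (rule == "BB") or (rule == "ALT" and k % 2 == 1)
        if use_bb and s_prev is not None:
            # BB1 step: alpha = s's / s'y, s = x_k - x_{k-1}, y = g_k - g_{k-1}
            alpha = (s_prev @ s_prev) / (s_prev @ y_prev)
        else:
            # Exact line search for a quadratic: alpha = g'g / g'Ag
            alpha = (g @ g) / (g @ (A @ g))
        x_new = x - alpha * g          # gradient step along -g
        g_new = A @ x_new - b
        s_prev, y_prev = x_new - x, g_new - g
        x, g = x_new, g_new
    return x, k

# Usage on a small ill-conditioned quadratic: the BB and alternating rules
# typically need far fewer iterations than pure SD.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
for rule in ("SD", "BB", "ALT"):
    x_star, iters = gradient_method(A, b, np.zeros(3), rule=rule)
    print(rule, iters)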

Bibliographic Details
Main Author: Lee, Sui Fong
Format: Thesis (Masters)
Institution: Universiti Putra Malaysia
Language: English
Published: 2009
Subjects: Conjugate gradient methods; Mathematical optimization
Online Access: http://psasir.upm.edu.my/id/eprint/12367/
http://psasir.upm.edu.my/id/eprint/12367/1/IPM_2009_11A.pdf