A new two-step gradient-type method for large-scale unconstrained optimization

In this paper, we propose some improvements to a new gradient-type method for solving large-scale unconstrained optimization problems, in which data from the two previous steps are used to revise the current approximate Hessian. The new method resembles the Barzilai and Borwein (BB) method. The innovative feature of this approach is that the Hessian is approximated by a diagonal matrix derived from a modified weak secant equation, rather than by the multiple of the identity used in the BB method. With this approach, a higher-order accuracy of the Hessian approximation can be obtained compared with other existing BB-type methods. By incorporating a simple monotone strategy, global convergence of the new method is achieved. Practical insights into the effectiveness of the proposed method are given by a numerical comparison with the BB method and its variant.
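
For readers unfamiliar with this family of methods, the sketch below illustrates the general idea in Python: a gradient iteration x_{k+1} = x_k - B_k^{-1} g_k in which B_k is kept diagonal and updated so that the weak secant condition s^T B s = s^T y holds, combined with a simple backtracking safeguard that enforces monotone decrease. This is only an illustrative sketch of the standard one-step diagonal (quasi-Cauchy) update; the paper's modified weak secant equation built from data of two previous steps, and its exact monotone strategy, are not reproduced here.

import numpy as np

def diagonal_gradient_method(f, grad, x0, tol=1e-6, max_iter=5000):
    """Gradient iteration with a diagonal Hessian approximation B_k = diag(d).

    Illustrative sketch only: d is updated by the least-change diagonal update
    satisfying the one-step weak secant equation s^T B s = s^T y; the paper's
    two-step modified update is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = np.ones_like(x)                      # diagonal of B_k, start from the identity
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        p = -g / d                           # step direction -B_k^{-1} g_k
        # Simple monotone safeguard (an assumption, not the authors' strategy):
        # backtrack until an Armijo-type decrease in f is obtained.
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-10:
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                       # keep the diagonal positive
            # Least-change update of diag(d) subject to s^T B s = s^T y:
            #   d_i <- d_i + lam * s_i^2,  lam = (s^T y - s^T B s) / sum_i s_i^4
            lam = (sy - (s * s) @ d) / max((s ** 2) @ (s ** 2), 1e-12)
            d = np.maximum(d + lam * (s * s), 1e-8)
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 0.5 x^T A x (hypothetical test problem):
A = np.diag([1.0, 10.0, 100.0])
x_min = diagonal_gradient_method(lambda x: 0.5 * x @ A @ x,
                                 lambda x: A @ x,
                                 np.array([1.0, 1.0, 1.0]))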

Bibliographic Details
Main Authors: Farid, Mahboubeh; Leong, Wah June; Abu Hassan, Malik
Format: Article (peer reviewed)
Language: English
Published: Pergamon Press, May 2010
Published in: Computers and Mathematics with Applications, 59 (10), pp. 3301-3307. ISSN 0898-1221; E-ISSN 1873-7668
DOI: 10.1016/j.camwa.2010.03.014
Institution: Universiti Putra Malaysia (UPM Institutional Repository)
Online Access: http://psasir.upm.edu.my/id/eprint/12759/
http://psasir.upm.edu.my/id/eprint/12759/1/A%20new%20two.pdf