Scaled memoryless symmetric rank one method for large-scale optimization.


Bibliographic Details
Main Authors: Leong, Wah June, Abu Hassan, Malik
Format: Article
Language: English
Published: Elsevier 2011
Online Access: http://psasir.upm.edu.my/id/eprint/24641/
http://psasir.upm.edu.my/id/eprint/24641/1/Scaled%20memoryless%20symmetric%20rank%20one%20method%20for%20large.pdf
Repository: UPM Institutional Repository, Universiti Putra Malaysia

Description: This paper concerns the memoryless quasi-Newton method, that is, the quasi-Newton method in which the approximation to the inverse of the Hessian is, at each step, updated from the identity matrix; its search direction can therefore be computed without storing any matrices. In this paper, a scaled memoryless symmetric rank one (SR1) method for solving large-scale unconstrained optimization problems is developed. The basic idea is to incorporate the SR1 update within the framework of the memoryless quasi-Newton method. However, it is well known that the SR1 update may not preserve positive definiteness even when updated from a positive definite matrix. We therefore propose a memoryless SR1 method that is updated from a positively scaled identity, where the scaling factor is derived so that positive definiteness of the updated matrices is preserved while the conditioning of the scaled memoryless SR1 update is improved. Under very mild conditions it is shown that, for strictly convex objective functions, the method is globally convergent with a linear rate of convergence. Numerical results show that the optimally scaled memoryless SR1 method is very encouraging.
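
The central computational point of the description above is that, because the SR1 correction is applied to a scaled identity rather than to a stored matrix, the search direction can be formed from a few inner products and vector additions. The Python sketch below illustrates that idea only; it is not the authors' implementation. In particular, the simple spectral scaling theta = (s^T s)/(s^T y) is a stand-in for the optimal scaling factor derived in the paper, and a steepest-descent safeguard is added because this placeholder scaling does not by itself guarantee a positive definite update.

import numpy as np

def memoryless_sr1_direction(g, s, y, theta):
    # Memoryless SR1 approximation of the inverse Hessian:
    #   H = theta*I + (v v^T) / (v^T y),  with  v = s - theta*y,
    # so the direction d = -H*g needs only inner products and vector sums;
    # no n-by-n matrix is ever formed or stored.
    v = s - theta * y
    denom = v @ y
    if abs(denom) <= 1e-8 * np.linalg.norm(v) * np.linalg.norm(y):
        return -theta * g                    # drop the ill-defined rank-one term
    return -(theta * g + ((v @ g) / denom) * v)

def spectral_scaling(s, y):
    # Illustrative scaling theta = (s^T s)/(s^T y); the paper instead derives an
    # optimal scaling chosen so that the update stays positive definite and
    # well conditioned.
    sy = s @ y
    return (s @ s) / sy if sy > 0.0 else 1.0

# Usage example on the strictly convex quadratic f(x) = 0.5*x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * (x @ A @ x) - b @ x
grad = lambda x: A @ x - b

x = np.zeros(3)
g = grad(x)
d = -g                                       # first iteration: steepest descent
for _ in range(100):
    if g @ d >= 0.0:                         # safeguard: the placeholder scaling does
        d = -g                               # not guarantee a descent direction
    alpha, fx = 1.0, f(x)                    # backtracking (Armijo) line search
    while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
        alpha *= 0.5
    x_new = x + alpha * d
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    d = memoryless_sr1_direction(g_new, s, y, spectral_scaling(s, y))
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-8:
        break

print("gradient norm at termination:", np.linalg.norm(grad(x)))

For strictly convex problems such as the quadratic above, the abstract reports global convergence at a linear rate for the optimally scaled variant; the sketch is only meant to show that each iteration costs a handful of O(n) vector operations.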

Citation: Leong, Wah June and Abu Hassan, Malik (2011) Scaled memoryless symmetric rank one method for large-scale optimization. Applied Mathematics and Computation, 218 (2), pp. 413-418. ISSN 0096-3003. DOI: 10.1016/j.amc.2011.05.080. Publisher: Elsevier, http://www.elsevier.com/