A new edition of conjugate gradient methods for large-scale unconstrained optimization

Bibliographic Details
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072
date 2014-12-10 15:00:13
format Restricted Document
id 11350
institution UniSZA
originalfilename 5572-01-FH02-FIK-14-02086.jpg
person UniSZA
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=11350
journal International Journal of Mathematical Analysis
volume 8
issue 46
pages 2277-2291
publisher HIKARI Ltd.
access Private Access (Restricted Document)
imageformat image/jpeg, 1396x787
summary Conjugate gradient (CG) methods are popular for solving nonlinear unconstrained optimization problems because they require little computational memory. In this paper, we propose a new CG coefficient (βk) that possesses global convergence properties under an exact line search. The proposed method satisfies the sufficient descent condition under the strong Wolfe line search. Numerical results based on the number of iterations and central processing unit (CPU) time show that the new βk performs better than some other well-known CG methods.
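The abstract refers to the standard nonlinear CG iteration, x_{k+1} = x_k + α_k d_k with search direction d_k = -g_k + β_k d_{k-1}, where different choices of the coefficient β_k define different CG methods. This record does not reproduce the paper's new β_k formula, so the sketch below uses the classical Fletcher-Reeves coefficient, β_k = ‖g_k‖² / ‖g_{k-1}‖², on a quadratic objective, where the exact line-search step has a closed form; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def cg_quadratic_fr(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a CG iteration using the Fletcher-Reeves beta as a stand-in
    for the paper's new coefficient."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                 # gradient of the quadratic
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search: alpha minimizing f(x + alpha * d) in closed form
        alpha = -(g @ d) / (d @ (A @ d))
        x = x + alpha * d
        g_new = A @ x - b
        # Fletcher-Reeves coefficient: ||g_{k+1}||^2 / ||g_k||^2
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d     # new conjugate direction
        g = g_new
    return x
```

On a quadratic, minimizing f is equivalent to solving Ax = b, so the iterate can be checked against a direct solve; for an n-dimensional quadratic the iteration terminates in at most n steps in exact arithmetic.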
title A new edition of conjugate gradient methods for large-scale unconstrained optimization