Global convergence of a new spectral conjugate gradient by using strong Wolfe line search

Bibliographic Details
Format: Restricted Document
_version_ 1860797271027744768
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072
date 2015-07-01 12:16:12
format Restricted Document
id 12028
institution UniSZA
internalnotes [1] J. Barzilai and J. M. Borwein, "Two-point step size gradient methods," IMA Journal of Numerical Analysis, 8 (1) (1988), 141-148. http://dx.doi.org/10.1093/imanum/8.1.141
[2] M. Raydan, "The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem," SIAM Journal on Optimization, 7 (1) (1997), 26-33. http://dx.doi.org/10.1137/s1052623494266365
[3] M. Rivaie, M. Mamat, W. J. Leong and I. Mohd, "A new class of nonlinear conjugate gradient coefficients with global convergence properties," Applied Mathematics and Computation, 218 (2012), 11323-11332. http://dx.doi.org/10.1016/j.amc.2012.05.030
[4] R. Fletcher, Practical Methods of Optimization, Vol. 1: Unconstrained Optimization, Wiley, New York, 1987.
[5] E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, 3 (16) (1969), 35-43.
[6] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," J. Res. Nat. Bur. Stand., 49 (1952), 409-436. http://dx.doi.org/10.6028/jres.049.044
[7] R. Fletcher and C. Reeves, "Function minimization by conjugate gradients," Comput. J., 7 (1964), 149-154. http://dx.doi.org/10.1093/comjnl/7.2.149
[8] G. Zoutendijk, "Nonlinear programming, computational methods," in Integer and Nonlinear Programming, J. Abadie (ed.), 1970, 37-86.
[9] M. J. D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," Lecture Notes in Mathematics, 1066, Springer, Berlin, 1984, 122-141. http://dx.doi.org/10.1007/bfb0099521
[10] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, 12 (1977), 241-254. http://dx.doi.org/10.1007/bf01593790
[11] L. Jinkui and J. Youyi, "Global convergence of a spectral conjugate gradient method for unconstrained optimization," Abstract and Applied Analysis, Volume 2012, Article ID 758287, 12 pages. http://dx.doi.org/10.1155/2012/758287
[12] E. G. Birgin and J. M. Martínez, "A spectral conjugate gradient method for unconstrained optimization," Applied Mathematics and Optimization, 43 (2) (2001), 117-128. http://dx.doi.org/10.1007/s00245-001-0003-0
[13] P. Wolfe, "Convergence conditions for ascent methods. II: Some corrections," SIAM Review, 13 (2) (1971), 185-188. http://dx.doi.org/10.1137/1013035
[14] M. Molga and C. Smutnicki, Test Functions for Optimization Needs, 2005. www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf
[15] A. A. Goldstein, "On steepest descent," SIAM J. Control, 3 (1965), 147-151. http://dx.doi.org/10.1137/0303013
[16] L. Armijo, "Minimization of functions having Lipschitz continuous first partial derivatives," Pacific J. Math., 16 (1966), 1-3. http://dx.doi.org/10.2140/pjm.1966.16.1
[17] P. Wolfe, "Convergence conditions for ascent methods," SIAM Review, 11 (1969), 226-235. http://dx.doi.org/10.1137/1011036
[18] N. Andrei, "An unconstrained optimization test functions collection," Advanced Modeling and Optimization, 10 (1) (2008), 147-161.
[19] N. H. M. Yussoff, M. Mamat, M. Rivaie and I. Mohd, "A new conjugate gradient method for unconstrained optimization with sufficient descent," AIP Conference Proceedings, 1602 (2014), 514-519. http://dx.doi.org/10.1063/1.4882534
[20] S. Shoid, M. Rivaie, M. Mamat and I. Mohd, "Solving unconstrained optimization with a new type of conjugate gradient method," AIP Conference Proceedings, 1602 (2014), 574-579. http://dx.doi.org/10.1063/1.4882542
[21] N. Shapiee, M. Rivaie, M. Mamat and I. Mohd, "A new modification of the Hestenes-Stiefel method with descent properties," AIP Conference Proceedings, 1602 (2014), 520-526. http://dx.doi.org/10.1063/1.4882535
[22] M. Rivaie, A. Abashar, M. Mamat and I. Mohd, "The convergence properties of a new type of conjugate gradient methods," Applied Math. Sci., 8 (1) (2014), 33-44.
[23] J. Nocedal and S. J. Wright, Numerical Optimization, Springer Series in Operations Research, Springer-Verlag, Berlin, 2006. http://dx.doi.org/10.1007/978-0-387-40065-5
originalfilename 6330-01-FH02-FIK-15-03433.jpg
person UniSZA
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12028
spelling 12028 https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12028 https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 Restricted Document Article Journal UniSZA image/jpeg inches 96 96 1403 10 10 786 2015-07-01 12:16:12 1403x786 6330-01-FH02-FIK-15-03433.jpg UniSZA Private Access Global convergence of a new spectral conjugate gradient by using strong Wolfe line search Applied Mathematical Sciences Unconstrained optimization problems can be solved with a few popular methods, such as the Conjugate Gradient (CG) method, the Steepest Descent (SD) method and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. The SD method is the simplest, but the CG method is now widely used because of its convergence properties. A set of unconstrained optimization problems with several different numbers of variables is used to establish the global convergence of the new spectral conjugate gradient method, which is compared with the five most common βk coefficients proposed by earlier researchers under an inexact line search. 9 61 HIKARI Ltd. 3105-3117
spellingShingle Global convergence of a new spectral conjugate gradient by using strong wolfe line search
summary Unconstrained optimization problems can be solved with a few popular methods, such as the Conjugate Gradient (CG) method, the Steepest Descent (SD) method and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. The SD method is the simplest, but the CG method is now widely used because of its convergence properties. A set of unconstrained optimization problems with several different numbers of variables is used to establish the global convergence of the new spectral conjugate gradient method, which is compared with the five most common βk coefficients proposed by earlier researchers under an inexact line search.
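The summary describes a conjugate gradient iteration paired with a strong Wolfe line search. As an illustrative sketch only (the paper's new spectral βk is not given in this record), the classic Fletcher-Reeves coefficient combined with a simple bisection-based strong Wolfe search can be written as follows; all function names here are hypothetical, and numpy is assumed to be available:

```python
import numpy as np

def strong_wolfe(phi, dphi, c1=1e-4, c2=0.1, max_iter=50):
    """Bisection search for a step length satisfying the strong Wolfe
    conditions: phi(a) <= phi(0) + c1*a*phi'(0) and |phi'(a)| <= -c2*phi'(0).
    Assumes phi'(0) < 0 (descent direction)."""
    a, lo, hi = 1.0, 0.0, float("inf")
    phi0, dphi0 = phi(0.0), dphi(0.0)
    for _ in range(max_iter):
        da = dphi(a)
        if phi(a) > phi0 + c1 * a * dphi0 or da > -c2 * dphi0:
            hi = a                      # step too long: shrink the bracket
        elif da < c2 * dphi0:
            lo = a                      # step too short: grow or bisect
            if hi == float("inf"):
                a *= 2.0
                continue
        else:
            return a                    # both strong Wolfe conditions hold
        a = 0.5 * (lo + hi)
    return a                            # fallback after max_iter bisections

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Fletcher-Reeves coefficient
    beta_k = ||g_{k+1}||^2 / ||g_k||^2 and a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        phi = lambda a: f(x + a * d)
        dphi = lambda a: grad(x + a * d) @ d
        a = strong_wolfe(phi, dphi)
        x_new = x + a * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves beta_k
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a small convex quadratic f(x) = 0.5 xᵀAx - bᵀx the iteration drives the gradient to zero, recovering the solution of Ax = b; with c2 < 1/2 the strong Wolfe conditions keep every Fletcher-Reeves direction a descent direction, which is the standard ingredient in the global convergence analyses the record's references discuss.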
title Global convergence of a new spectral conjugate gradient by using strong wolfe line search
title_full Global convergence of a new spectral conjugate gradient by using strong wolfe line search
title_fullStr Global convergence of a new spectral conjugate gradient by using strong wolfe line search
title_full_unstemmed Global convergence of a new spectral conjugate gradient by using strong wolfe line search
title_short Global convergence of a new spectral conjugate gradient by using strong wolfe line search
title_sort global convergence of a new spectral conjugate gradient by using strong wolfe line search