A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
| Field | Value |
|---|---|
| _version_ | 1860797321224126464 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 |
| date | 2015-09-02 10:36:27 |
| format | Restricted Document |
| id | 12240 |
| institution | UniSZA |
| internalnotes | [1] A.A. Goldstein, On steepest descent, SIAM J. Control 3 (1965) 147–151. [2] D. Touati-Ahmed, C. Storey, Efficient hybrid conjugate gradient techniques, J. Optim. Theory Appl. 64 (1990) 379–397. [3] E.G. Birgin, J.M. Martinez, A spectral conjugate gradient method for unconstrained optimization, Appl. Math. Optim. 43 (2001) 117–128. [4] E. Dolan, J.J. More, Benchmarking optimization software with performance profiles, Math. Program. 91 (2002) 201–213. [5] E. Polak, G. Ribiere, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle 3 (1969) 35–43. [6] G. Yuan, Z. Wei, New line search methods for unconstrained optimization, J. Korean Stat. Soc. 38 (2009) 29–39. [7] G. Yuan, S. Lu, Z. Wei, A line search algorithm for unconstrained optimization, J. Softw. Eng. Appl. 3 (2010) 503–509. [8] G. Yuan, X. Lu, Z. Wei, A conjugate gradient method with descent direction for unconstrained optimization, J. Comput. Appl. Math. 233 (2009) 519–530. [9] G. Zoutendijk, Nonlinear programming, computational methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970. [10] J.C. Gilbert, J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim. 2 (1992) 21–42. [11] J. Sun, J. Zhang, Global convergence of conjugate gradient methods without line search, Ann. Oper. Res. 103 (2001) 161–173. [12] K.E. Hillstrom, A simulation test approach to the evaluation of nonlinear optimization algorithms, ACM Trans. Math. Softw. 3 (1977) 305–315. [13] L. Armijo, Minimization of functions having Lipschitz continuous first partial derivatives, Pacific J. Math. 16 (1966) 1–3. [14] M. Al-Baali, Descent property and global convergence of the Fletcher–Reeves method with inexact line search, IMA J. Numer. Anal. 5 (1985) 121–124. [15] M.J.D. Powell, Restart procedures for the conjugate gradient method, Math. Program. 12 (1977) 241–254. [16] M.J.D. Powell, Nonconvex minimization calculations and the conjugate gradient method, in: Lecture Notes in Mathematics, vol. 1066, Springer-Verlag, Berlin, 1984, pp. 122–141. [17] M.J.D. Powell, Convergence properties of algorithms for nonlinear optimization, SIAM Rev. 28 (1986) 487–500. [18] M. Rivaie, M. Mustafa, M. Ismail, M. Fauzi, A comparative study of conjugate gradient coefficients for unconstrained optimization, Aust. J. Basic Appl. Sci. 5 (2011) 947–951. [19] M. Rivaie, M. Mustafa, L.W. June, I. Mohd, A new class of nonlinear conjugate gradient coefficient with global convergence properties, Appl. Math. Comput. 218 (2012) 11323–11332. [20] M.R. Hestenes, E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Stand. 49 (1952) 409–436. [21] N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim. 10 (2008) 147–161. [22] N. Andrei, Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization, Bull. Malays. Math. Sci. Soc. (2) 34 (2) (2011) 319–330. [23] N. Andrei, Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization, J. Comput. Appl. Math. 230 (2009) 570–582. [24] N. Andrei, 40 conjugate gradient algorithms for unconstrained optimization, Bull. Malays. Math. Sci. Soc. 34 (2011) 319–330. [25] P. Wolfe, Convergence conditions for ascent methods, SIAM Rev. 11 (1969) 226–235. [26] R. Fletcher, Practical Methods of Optimization, vol. 1: Unconstrained Optimization, John Wiley & Sons, New York, 1987. [27] R. Fletcher, C. Reeves, Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154. [28] W.W. Hager, H.C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005) 170–192. [29] Y.H. Dai, Nonlinear conjugate gradient methods, Wiley Encyclopedia of Operations Research, New York, 2011. [30] Y.H. Dai, C.X. Kou, A nonlinear conjugate gradient method with an optimal property and an improved Wolfe line search, SIAM J. Optim. 23 (1) (2013) 296–320. [31] Y.H. Dai, Y. Yuan, Nonlinear Conjugate Gradient Methods, Shanghai Scientific and Technical Publishers, Beijing, 1998. [32] Y.H. Dai, Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999) 177–182. [33] Y.H. Dai, Y. Yuan, A note on the nonlinear conjugate gradient method, J. Comput. Appl. Math. 18 (6) (2002) 575–582. [34] Y. Liu, C. Storey, Efficient generalized conjugate gradient algorithms, part 1: Theory, J. Comput. Appl. Math. 69 (1992) 129–137. [35] Y. Yuan, W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing, 1999. [36] Z.F. Dai, Two modified HS type conjugate gradient methods for unconstrained optimization problems, Nonlinear Anal. 74 (2011) 927–936. [37] Z.J. Shi, J. Guo, A new family of conjugate gradient methods, J. Comput. Appl. Math. 224 (2009) 444–457. [38] Z.J. Shi, S. Wang, Z. Xu, The convergence of conjugate gradient method with nonmonotone line search, Appl. Math. Comput. 217 (2010) 1921–1932. [39] Z. Wei, G. Li, L. Qi, New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems, Appl. Math. Comput. 179 (2006) 407–430. |
| originalfilename | 6540-01-FH02-FIK-15-03708.jpg |
| person | UniSZA |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12240 |
| spelling | 12240 https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12240 https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 Restricted Document Article Journal UniSZA image/jpeg inches 96 96 1405 789 75 75 2015-09-02 10:36:27 1405x789 6540-01-FH02-FIK-15-03708.jpg UniSZA Private Access A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches Elsevier Inc. 268 1152-1163 |
| spellingShingle | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| summary | Conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients (βk) that possess the sufficient descent condition and global convergence properties. This new βk is an extension of the already proven βkRMIL from Rivaie et al. [19] (A new class of nonlinear conjugate gradient coefficient with global convergence properties, Appl. Math. Comput. 218 (2012) 11323–11332). Global convergence results are established under both exact and inexact line searches. Numerical results show that the performance of the newly proposed formula is similar to that of βkRMIL and suited to both line searches. Importantly, this βk is more efficient than, and superior to, the other well-known βk formulas. |
| title | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| title_full | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| title_fullStr | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| title_full_unstemmed | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| title_short | A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
| title_sort | new class of nonlinear conjugate gradient coefficients with exact and inexact line searches |
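The abstract above describes a nonlinear CG method whose coefficient βk extends βkRMIL and whose convergence holds under both exact and inexact line searches. As a minimal sketch of the general setup, the following code implements a nonlinear CG iteration with one common reading of the RMIL-style coefficient, βk = gk᷀ᵀ(gk − gk−1)/‖dk−1‖², paired with an Armijo backtracking (inexact) line search. The coefficient formula, the restart safeguard, and the solver name `cg_rmil` are illustrative assumptions, not the paper's exact method, which is restricted-access here.

```python
import numpy as np

def cg_rmil(f, grad, x0, tol=1e-8, max_iter=10000):
    """Nonlinear CG with an RMIL-style beta (assumed form) and Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (inexact) line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d) and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Assumed RMIL-style coefficient: g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2,
        # clipped at zero as a nonnegativity safeguard
        beta = max(g_new.dot(g_new - g) / d.dot(d), 0.0)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:              # restart if the descent property is lost
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = cg_rmil(lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x),
                 lambda x: A.dot(x) - b,
                 [0.0, 0.0])
print(x_star)  # should approach the solution of A x = b, i.e. roughly [0.2, 0.4]
```

Swapping the Armijo loop for an exact minimization of f along d reproduces the exact-line-search variant the abstract mentions; on quadratics the two behave similarly.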