A new conjugate gradient method with exact line search
| Field | Value |
|---|---|
| _version_ | 1860797319087128576 |
| building | INTELEK Repository |
| collection | Online Access |
| collectionurl | https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 |
| date | 2015-08-16 10:27:41 |
| format | Restricted Document |
| id | 12230 |
| institution | UniSZA |
| internalnotes | [1] E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e Année, 16 (1969), 35-43. [2] G. Zoutendijk, Nonlinear programming, computational methods, in: Integer and Nonlinear Programming, J. Abadie (ed.), (1970), 37-86. [3] M. Molga and C. Smutnicki, Test functions for optimization needs, (2005). [4] M. R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Stand., 49 (1952), 409-436. http://dx.doi.org/10.6028/jres.049.044 [5] M. Rivaie, M. Mamat, W. J. Leong and M. Ismail, A new class of nonlinear conjugate gradient coefficients with global convergence properties, Applied Mathematics and Computation, 218 (2012), 11323-11332. http://dx.doi.org/10.1016/j.amc.2012.05.030 [6] M. J. D. Powell, Nonconvex minimization calculations and the conjugate gradient method, Lecture Notes in Mathematics, Springer, Berlin, 1066 (1984), 122-141. http://dx.doi.org/10.1007/bfb0099521 [7] M. J. D. Powell, Restart procedures for the conjugate gradient method, Mathematical Programming, 12 (1977), 241-254. http://dx.doi.org/10.1007/bf01593790 [8] N. Andrei, An unconstrained optimization test functions collection, J. Adv. Modeling and Optimization, 10 (2008), 147-161. [9] R. Fletcher and C. Reeves, Function minimization by conjugate gradients, Comput. J., 7 (1964), 149-154. http://dx.doi.org/10.1093/comjnl/7.2.149 [10] A. Y. Al-Bayati and R. Z. Al-Kawaz, A new hybrid WC-FR conjugate gradient algorithm with modified secant condition for unconstrained optimization, J. Math. Comp. Sci., 2 (2012), 937-966. [11] Y. H. Dai, J. Y. Han, G. H. Liu, D. F. Sun, X. Yin and Y. Yuan, Convergence properties of nonlinear conjugate gradient methods, SIAM J. Optim., 10 (1999), 348-358. http://dx.doi.org/10.1137/s1052623494268443 [12] P. Wolfe, Convergence conditions for ascent methods. II: Some corrections, SIAM Rev., 13 (2) (1971), 185-188. http://dx.doi.org/10.1137/1013035 [13] Z. J. Shi, Convergence of line search methods for unconstrained optimization, Applied Mathematics and Computation, 157 (2004), 393-405. http://dx.doi.org/10.1016/j.amc.2003.08.058 [14] W. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer Science and Business Media, LLC (2006). [15] Z. J. Shi and J. Shen, On step-size estimation of line search methods, Applied Mathematics and Computation, 173 (2006), 360-371. http://dx.doi.org/10.1016/j.amc.2005.04.039 [16] N. Shapiee, R. M. Mamat and I. Mohd, A new modification of Hestenes-Stiefel method with descent properties, AIP Conference Proceedings, 1602 (2014), 520-526. http://dx.doi.org/10.1063/1.4882535 [17] M. Rivaie, A. Abashar, M. Mamat and I. Mohd, The convergence properties of a new type of conjugate gradient methods, Applied Mathematical Sciences, 8 (2014), 33-44. http://dx.doi.org/10.12988/ams.2014.310578 [18] N. H. M. Yussoff, M. Mamat, M. Rivaie and I. Mohd, A new conjugate gradient method for unconstrained optimization with sufficient descent, AIP Conference Proceedings, 1602 (2014), 514-519. http://dx.doi.org/10.1063/1.4882534 [19] Y. Dai and Y. Yuan, Nonlinear Conjugate Gradient Methods, Shanghai Scientific and Technical Publisher, Beijing (1998). [20] Y. Yuan and W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing (1999). [21] A. Abashar, M. Mamat, M. Rivaie and I. Mohd, Global convergence properties of a new class of conjugate gradient method for unconstrained optimization, Applied Mathematical Sciences, 8 (67) (2014), 3307-3319. http://dx.doi.org/10.12988/ams.2014.43246 [22] S. Shoid, M. Rivaie, M. Mamat and I. Mohd, Solving unconstrained optimization with a new type of conjugate gradient method, AIP Conference Proceedings, 1602 (2014), 574-579. http://dx.doi.org/10.1063/1.4882542 [23] A. Abashar, M. Mamat, M. Rivaie, I. Mohd and O. Omer, The proof of sufficient descent condition for a new type of conjugate gradient methods, AIP Conference Proceedings, 1602 (2014), 296-303. http://dx.doi.org/10.1063/1.4882502 |
| originalfilename | 6530-01-FH02-FIK-15-03666.jpg |
| person | UniSZA |
| recordtype | oai_dc |
| resourceurl | https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12230 |
| spelling | 12230; Restricted Document; Article, Journal; UniSZA; image/jpeg, 1416x684, 6530-01-FH02-FIK-15-03666.jpg; 2015-08-16 10:27:41; A new conjugate gradient method with exact line search; Applied Mathematical Sciences, 9 (93), 4799-4812; HIKARI Ltd.; http://dx.doi.org/10.12988/ams.2015.53243 |
| spellingShingle | A new conjugate gradient method with exact line search |
| summary | Conjugate gradient (CG) methods are widely used to solve large-scale unconstrained optimization problems because of their simplicity and low memory requirements. In this paper, we propose a new CG coefficient, β_k. The new β_k is computed as the average of two existing coefficients: Polak-Ribière (PR) and Norrlaili et al. (NRMI). Numerical comparisons are made with five other β_k coefficients proposed by earlier researchers, using a set of eight unconstrained optimization problems with several different numbers of variables. It is shown that the new β_k with an exact line search possesses global convergence properties. Numerical results also show that the new β_k outperforms some of these CG methods. |
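The iteration described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the PR formula is standard, the exact line search uses the closed-form step length available for a strictly convex quadratic, and since this record does not state the NRMI formula, an RMIL-type coefficient (g_{k+1}^T(g_{k+1} - g_k) / ||d_k||^2) is used as an assumed stand-in for it; the averaging of the two coefficients follows the abstract.

```python
import numpy as np

def cg_averaged(A, b, x0, tol=1e-8, max_iter=200):
    """Minimise f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a CG iteration whose coefficient beta_k is the average of
    Polak-Ribiere and an assumed RMIL-type stand-in for NRMI.
    On a quadratic, the exact line search has the closed form
    alpha = -g^T d / (d^T A d)."""
    x = x0.astype(float)
    g = A @ x - b              # gradient of the quadratic
    d = -g                     # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))    # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta_pr = g_new @ (g_new - g) / (g @ g)    # Polak-Ribiere
        beta_rmil = g_new @ (g_new - g) / (d @ d)  # assumed NRMI stand-in
        beta = 0.5 * (beta_pr + beta_rmil)         # averaged coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:     # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# Usage: minimise a small strictly convex quadratic, i.e. solve A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = cg_averaged(A, b, np.zeros(2))
```

Note that on a quadratic with an exact line search the PR and Fletcher-Reeves values coincide, so the averaging only changes behaviour on genuinely nonlinear objectives; the quadratic here merely keeps the line search exact and the sketch short.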
| title | A new conjugate gradient method with exact line search |
| title_full | A new conjugate gradient method with exact line search |
| title_fullStr | A new conjugate gradient method with exact line search |
| title_full_unstemmed | A new conjugate gradient method with exact line search |
| title_short | A new conjugate gradient method with exact line search |
| title_sort | new conjugate gradient method with exact line search |