Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
RAIRO - Operations Research - Recherche Opérationnelle, Volume 51 (2017) no. 1, pp. 67-77.

The descent condition is a crucial factor in establishing the global convergence of nonlinear conjugate gradient methods. In this paper, we propose some modified Yabe–Takano conjugate gradient methods in which the corresponding search directions always satisfy the sufficient descent property, independently of the convexity of the objective function. Unlike existing methods, a new update strategy for constructing the search direction is proposed to establish the global convergence of the presented methods for general nonconvex objective functions. Numerical results illustrate that our methods solve the test problems efficiently and are therefore promising.
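For context, a brief notational sketch (standard conjugate gradient notation; the specific parameter choices of the modified Yabe–Takano methods are given in the paper itself and are not reproduced here). A nonlinear conjugate gradient method for minimizing a smooth function $f$ generates iterates
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is a step length obtained by a line search, and $\beta_k$ is the conjugate gradient parameter. The sufficient descent condition referred to in the abstract requires a constant $c > 0$ such that
\[
g_k^{\mathrm{T}} d_k \le -c \,\lVert g_k \rVert^{2} \quad \text{for all } k \ge 0,
\]
which guarantees that each $d_k$ is a descent direction for $f$ at $x_k$. In the Yabe–Takano family, $\beta_k$ is derived from a modified secant condition that exploits function values as well as gradients (see the entries for Zhang, Deng and Chen, and for Yabe and Takano, in the references below).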

DOI: 10.1051/ro/2016028
Classification: 49M37, 65K05, 90C53
Keywords: Yabe–Takano conjugate gradient method, global convergence, sufficient descent condition, conjugacy condition, numerical comparison
Dong, Xiao Liang 1; Li, Wei Jun 2; He, Yu Bo 3

1 School of Mathematics and Information Science, Beifang University of Nationalities, No. 204 North Wenchang Rd, Yinchuan, Ningxia 750021, P.R. China.
2 Network Information Technology Center, Beifang University of Nationalities, No. 204 North Wenchang Rd, Yinchuan, Ningxia 750021, P.R. China.
3 Department of Mathematics and Applied Mathematics, Huaihua University, No. 612 Yingfeng East Road, Huaihua, Hunan 418008, P.R. China.
@article{RO_2017__51_1_67_0,
     author = {Dong, Xiao Liang and Li, Wei Jun and He, Yu Bo},
     title = {Some modified {Yabe{\textendash}Takano} conjugate gradient methods with sufficient descent condition},
     journal = {RAIRO - Operations Research - Recherche Op\'erationnelle},
     pages = {67--77},
     publisher = {EDP-Sciences},
     volume = {51},
     number = {1},
     year = {2017},
     doi = {10.1051/ro/2016028},
     zbl = {1358.49027},
     mrnumber = {3589264},
     language = {en},
     url = {http://archive.numdam.org/articles/10.1051/ro/2016028/}
}
Dong, Xiao Liang; Li, Wei Jun; He, Yu Bo. Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition. RAIRO - Operations Research - Recherche Opérationnelle, Volume 51 (2017) no. 1, pp. 67-77. doi: 10.1051/ro/2016028. http://archive.numdam.org/articles/10.1051/ro/2016028/

N. Andrei, Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34 (2011) 319–330. | MR | Zbl

N. Andrei, A simple three-term conjugate gradient algorithm for unconstrained optimization. J. Comput. Appl. Math. 241 (2013) 19–29. | DOI | MR | Zbl

N. Andrei, On three-term conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 219 (2013) 6316–6327. | MR | Zbl

S. Babaie-Kafaki, R. Ghanbari and N. Mahdavi-Amiri, Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234 (2010) 1374–1386. | DOI | MR | Zbl

I. Bongartz, A.R. Conn, N.I.M. Gould and P.L. Toint, CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Software 21 (1995) 123–160. | DOI | Zbl

W.Y. Cheng and D.H. Li, An active set modified Polak–Ribière–Polyak method for large-scale nonlinear bound constrained optimization. J. Optim. Theory Appl. 155 (2012) 1084–1094. | DOI | MR | Zbl

Y. Dai and C. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23 (2013) 296–320. | DOI | MR | Zbl

Y. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10 (1999) 177–182. | DOI | MR | Zbl

Y. Dai and L. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43 (2001) 87–101. | DOI | MR | Zbl

E.D. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles. Math. Program. Ser. A 91 (2002) 201–213. | DOI | MR | Zbl

X. Dong, Comment on “A new three-term conjugate gradient method for unconstrained problem”. Numer. Algor. 72 (2016) 173–179. | DOI | MR | Zbl

X. Dong, H. Liu and Y. He, A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 165 (2015) 225–241. | DOI | MR | Zbl

X. Dong, H. Liu and Y. He, New version of three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent direction. Appl. Math. Comput. 239 (2015) 1606–1617. | MR

X. Dong, H. Liu, Y. He, S. Babaie-Kafaki and R. Ghanbari, A new three-term conjugate gradient method with descent direction for unconstrained optimization. Math. Model. Anal. (to appear). | MR

X. Dong, H. Liu, Y. He, Y. Xu and X. Yang, Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence. Optim. Lett. 9 (2015) 1421–1432. | DOI | MR | Zbl

X. Dong, H. Liu, Y. He and X. Yang, A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281 (2015) 239–249. | DOI | MR | Zbl

R. Fletcher, Practical Methods of Optimization. John Wiley and Sons (2013). | MR | Zbl

R. Fletcher and C. Reeves, Function minimization by conjugate gradients. Comput. J. 7 (1964) 149–154. | DOI | MR | Zbl

J.A. Ford and I.A. Moghrabi, Multi-step quasi-Newton methods for optimization. J. Comput. Appl. Math. 50 (1994) 305–323. | DOI | MR | Zbl

J.A. Ford and I.A. Moghrabi, Using function values in multi-step quasi-Newton methods. J. Comput. Appl. Math. 66 (1996) 201–211. | DOI | MR | Zbl

J.A. Ford, Y. Narushima and H. Yabe, Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40 (2008) 191–216. | DOI | MR | Zbl

J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2 (1992) 21–42. | DOI | MR | Zbl

W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005) 170–192. | DOI | MR | Zbl

W.W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2 (2006) 35–58. | MR | Zbl

M.R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49 (1952) 409–436. | DOI | MR | Zbl

D.H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129 (2001) 15–35. | DOI | MR | Zbl

D.H. Li and M. Fukushima, On the global convergence of the BFGS method for nonconvex unconstrained optimization problems. SIAM J. Optim. 11 (2001) 1054–1064. | DOI | MR | Zbl

G. Li, C. Tang and Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202 (2007) 523–539. | DOI | MR | Zbl

Y.L. Liu and C.S. Storey, Efficient generalized conjugate gradient algorithms. Part 1: Theory. J. Optim. Theory Appl. 69 (1991) 129–137. | MR | Zbl

Y. Narushima and H. Yabe, Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization. J. Comput. Appl. Math. 236 (2012) 4303–4317. | DOI | MR | Zbl

Y. Narushima, H. Yabe and J.A. Ford, A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21 (2011) 212–230. | DOI | MR | Zbl

Z. Wei, G. Li and L. Qi, New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175 (2006) 1156–1188. | MR | Zbl

F. Wen, Z. He, Z. Dai and X. Yang, Characteristics of investors’ risk preference for stock markets. Econ. Comput. Econ. Cybern. Stud. Res. 3 (2014) 235–254.

P. Wolfe, Convergence conditions for ascent methods. SIAM Rev. 11 (1969) 226–235. | DOI | MR | Zbl

H. Yabe and M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28 (2004) 203–225. | DOI | MR | Zbl

G. Yu and L. Guan, A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett. 23 (2010) 555–560. | DOI | MR | Zbl

Y. Yuan, A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11 (1991) 325–332. | DOI | MR | Zbl

Y. Yuan and R.H. Byrd, Non-quasi-Newton updates for unconstrained optimization. J. Comput. Math. 13 (1995) 95–107. | MR | Zbl

J.Z. Zhang, N.Y. Deng and L.H. Chen, New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102 (1999) 147–167. | DOI | MR | Zbl

J. Zhang and C. Xu, Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137 (2001) 269–278. | DOI | MR | Zbl

L. Zhang, W. Zhou and D. Li, A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26 (2006) 629–640. | DOI | MR | Zbl

X. Zheng, H. Liu and A. Lu, Sufficient descent conjugate gradient methods for large-scale optimization problems. Int. J. Comput. Math. 88 (2011) 3436–3447. | DOI | MR | Zbl

W. Zhou and L. Zhang, A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21 (2006) 707–714. | DOI | MR | Zbl
