
Article

Title: A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization (English)
Author: Kim, Yongjin
Author: Jong, Yunchol
Author: Kim, Yong
Language: English
Journal: Applications of Mathematics
ISSN: 0862-7940 (print)
ISSN: 1572-9109 (online)
Volume: 69
Issue: 6
Year: 2024
Pages: 847-866
Summary lang: English
.
Category: math
.
Summary: Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively, and both perform more efficiently than the SSML-BFGS method itself. Motivated by this, C. Kou, Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, in this paper we present an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equations. The method is shown to be globally convergent under certain assumptions. Numerical results comparing the method with methods based on the usual secant equations are reported. (English)
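Note: For orientation, the difference between the usual secant condition and the multi-step condition of Ford and Moghrabi [7], [8] can be stated as follows; this is a generic two-step sketch, and the concrete parameter choice used in the paper may differ. With s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the usual condition requires the updated quasi-Newton matrix to satisfy

    B_{k+1} s_k = y_k,

while the two-step condition also uses data from the previous step,

    B_{k+1} r_k = w_k,    r_k = s_k - \psi_k s_{k-1},    w_k = y_k - \psi_k y_{k-1},

where the scalar \psi_k is determined by the parameterization of the curve interpolating the last three iterates (several standard choices are given in [7]).

Note: "Memoryless" means that the search direction is assembled from a few stored vectors rather than an n-by-n matrix, which is what makes this class of methods attractive for large-scale problems. The following Python sketch illustrates a memoryless-BFGS-type conjugate gradient loop in the spirit of Shanno [18]; the function name mlbfgs_cg, the restart safeguards, and the use of the strong-Wolfe search scipy.optimize.line_search (not the improved Wolfe line search of [2]) are illustrative assumptions, not the algorithm proposed in the paper.

import numpy as np
from scipy.optimize import line_search  # strong-Wolfe line search

def mlbfgs_cg(f, grad, x, tol=1e-6, max_iter=1000):
    # Memoryless BFGS: the direction d = -H g is assembled from the
    # vectors s, y only, so no n-by-n matrix is ever stored.
    g = grad(x)
    d = -g                            # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:             # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        s = alpha * d                 # s_k = x_{k+1} - x_k
        x = x + s
        g_new = grad(x)
        y = g_new - g                 # y_k = g_{k+1} - g_k
        rho = y @ s
        if rho > 1e-12:               # curvature condition keeps H positive definite
            sg, yg = s @ g_new, y @ g_new
            # three-term direction d = -H g, H the memoryless BFGS matrix:
            d = (-g_new
                 + (sg / rho) * y
                 + (yg / rho - (y @ y) * sg / rho**2 - sg / rho) * s)
        else:
            d = -g_new                # restart when curvature is unreliable
        g = g_new
    return x

For example, mlbfgs_cg(lambda x: np.sum((x - 1)**2), lambda x: 2*(x - 1), np.zeros(10**5)) runs with O(n) memory per iteration, which is the point of memoryless methods for large-scale minimization.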
Keyword: unconstrained optimization
Keyword: conjugate gradient method
Keyword: multi-step secant condition
Keyword: self-scaling
Keyword: improved Wolfe line search
MSC: 65K05
MSC: 90C06
DOI: 10.21136/AM.2024.0204-23
.
Date available: 2024-12-13T19:02:29Z
Last updated: 2024-12-16
Stable URL: http://hdl.handle.net/10338.dmlcz/152672
.
Reference: [1] Chen, Y., Cao, M., Yang, Y.: A new accelerated conjugate gradient method for large-scale unconstrained optimization. J. Inequal. Appl. 2019 (2019), Article ID 300, 13 pages. Zbl 1499.90216, MR 4036512, 10.1186/s13660-019-2238-9
Reference: [2] Dai, Y.-H., Kou, C.-X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23 (2013), 296-320. Zbl 1266.49065, MR 3033109, 10.1137/100813026
Reference: [3] Dai, Y.-H., Liao, L.-Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43 (2001), 87-101. Zbl 0973.65050, MR 1804396, 10.1007/s002450010019
Reference: [4] Dolan, E. D., Moré, J. J.: Benchmarking optimization software with performance profiles. Math. Program. 91 (2002), 201-213. Zbl 1049.90004, MR 1875515, 10.1007/s101070100263
Reference: [5] Dong, X.-L., Dai, Z.-F., Ghanbari, R., Li, X.-L.: An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Oper. Res. Soc. China 9 (2021), 411-425. Zbl 1488.49058, MR 4263215, 10.1007/s40305-019-00257-w
Reference: [6] Dong, X., Han, D., Dai, Z., Li, L., Zhu, J.: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 179 (2018), 944-961. Zbl 1402.90176, MR 3869490, 10.1007/s10957-018-1377-3
Reference: [7] Ford, J. A., Moghrabi, I. A.: Alternative parameter choices for multi-step quasi-Newton methods. Optim. Methods Softw. 2 (1993), 357-370. MR 1284271, 10.1080/10556789308805550
Reference: [8] Ford, J. A., Moghrabi, I. A.: Multi-step quasi-Newton methods for optimization. J. Comput. Appl. Math. 50 (1994), 305-323. Zbl 0807.65062, MR 1284271, 10.1016/0377-0427(94)90309-3
Reference: [9] Ford, J. A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40 (2008), 191-216. Zbl 1181.90221, MR 2390824, 10.1007/s10589-007-9087-z
Reference: [10] Hager, W. W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005), 170-192. Zbl 1093.90085, MR 2177774, 10.1137/030601880
Reference: [11] Kou, C. X., Dai, Y. H.: A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165 (2015), 209-224. Zbl 1319.49042, MR 3327422, 10.1007/s10957-014-0528-4
Reference: [12] Li, D., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129 (2001), 15-35. Zbl 0984.65055, MR 1823208, 10.1016/S0377-0427(00)00540-9
Reference: [13] Li, X., Shi, J., Dong, X., Yu, J.: A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization. J. Comput. Appl. Math. 350 (2019), 372-379. Zbl 1524.90295, MR 3877400, 10.1016/j.cam.2018.10.035
Reference: [14] Mtagulwa, P., Kaelo, P.: An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems. Appl. Numer. Math. 145 (2019), 111-120. Zbl 1477.65095, MR 3963704, 10.1016/j.apnum.2019.06.003
Reference: [15] Narushima, Y., Yabe, H., Ford, J. A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21 (2011), 212-230. Zbl 1250.90087, MR 2765496, 10.1137/080743573
Reference: [16] Nocedal, J., Wright, S. J.: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006). Zbl 1104.65059, MR 2244940, 10.1007/978-0-387-40065-5
Reference: [17] Perry, A.: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269 (1977), 16 pages. Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.
Reference: [18] Shanno, D. F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15 (1978), 1247-1257. Zbl 0408.90071, MR 0512697, 10.1137/0715085
Reference: [19] Wei, Z., Li, G., Qi, L.: New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175 (2006), 1156-1188. Zbl 1100.65054, MR 2220320, 10.1016/j.amc.2005.08.027
Reference: [20] Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28 (2004), 203-225. Zbl 1056.90130, MR 2072533, 10.1023/B:COAP.0000026885.81997.88
Reference: [21] Yuan, G., Wei, Z., Li, G.: A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255 (2014), 86-96. Zbl 1291.90315, MR 3093406, 10.1016/j.cam.2013.04.032
Reference: [22] Zheng, Y., Zheng, B.: Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems. J. Optim. Theory Appl. 175 (2017), 502-509. Zbl 1380.65108, MR 3717341, 10.1007/s10957-017-1140-1
Reference: [23] Zhou, Q., Hang, D.: Nonmonotone adaptive trust region method with line search based on new diagonal updating. Appl. Numer. Math. 91 (2015), 75-88. Zbl 1310.65070, MR 3312325, 10.1016/j.apnum.2014.12.009
Reference: [24] Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21 (2006), 707-714. Zbl 1112.90096, MR 2238653, 10.1080/10556780500137041
.

Fulltext not available (moving wall 24 months)
