
Article

Title: Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control (English)
Author: Akbari, Mahbube
Author: Nezhadhosein, Saeed
Author: Heydari, Aghile
Language: English
Journal: Applications of Mathematics
ISSN: 0862-7940 (print)
ISSN: 1572-9109 (online)
Volume: 69
Issue: 6
Year: 2024
Pages: 829-845
Summary lang: English
.
Category: math
.
Summary: We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), one of the most numerically efficient classes of methods for unconstrained optimization. The suggested parameter is based on an eigenvalue analysis of the search direction matrix and on minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding conjugate gradient search direction satisfies the sufficient descent property and the extended conjugacy condition. Global convergence of the proposed algorithm is established for both uniformly convex and general nonlinear objective functions. Numerical experiments on a set of test functions from the CUTEr collection and on the practical problem of robot manipulator motion control show that the proposed method is effective. (English)
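The summary describes a conjugate gradient direction of the form $d_{k+1} = -\theta_k g_{k+1} + \beta_k d_k$, where the scaling parameter $\theta_k$ is the object of study. As a rough illustration only — not the authors' parameter choice — the following sketch runs such a scaled CG iteration with a Hestenes-Stiefel-type $\beta_k$, an Oren-Luenberger-type scaling $\theta_k = s_k^\top y_k / y_k^\top y_k$, and a simple Armijo backtracking line search; all of these specific choices are assumptions made for the example.

```python
import numpy as np

def cg_scaled(f, grad, x0, tol=1e-8, max_iter=500):
    """Scaled CG iteration d = -theta*g + beta*d (illustrative choices only)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (stand-in for a Wolfe search)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        if sy <= 1e-12:
            d = -g_new                      # restart when curvature info is lost
        else:
            theta = sy / y.dot(y)           # Oren-Luenberger-type scaling (assumption)
            beta = g_new.dot(y) / sy        # Hestenes-Stiefel-type beta (assumption)
            d = -theta * g_new + beta * d
            if g_new.dot(d) >= 0.0:
                d = -g_new                  # safeguard: enforce a descent direction
        x, g = x_new, g_new
    return x, k

# usage: minimize a strictly convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star, iters = cg_scaled(f, grad, np.zeros(2))
```

On this quadratic the minimizer solves $Ax = b$, so the result can be checked against a direct linear solve; the paper's actual parameter is instead derived from the eigenvalues of the direction matrix and the Dennis-Wolkowicz measure function.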
Keyword: unconstrained optimization problem
Keyword: self-scaling memoryless BFGS
Keyword: conjugate gradient
Keyword: measure function
MSC: 65K99
MSC: 90C06
MSC: 90C53
DOI: 10.21136/AM.2024.0006-24
.
Date available: 2024-12-13T19:02:00Z
Last updated: 2024-12-16
Stable URL: http://hdl.handle.net/10338.dmlcz/152671
.
Reference: [1] Abubakar, A. B., Kumam, P., Malik, M., Ibrahim, A. H.: A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems.Math. Comput. Simul. 201 (2022), 640-657. Zbl 1540.90255, MR 4439403, 10.1016/j.matcom.2021.05.038
Reference: [2] Aminifard, Z., Babaie-Kafaki, S., Ghafoori, S.: An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing.Appl. Numer. Math. 167 (2021), 187-201. Zbl 1467.65060, MR 4258711, 10.1016/j.apnum.2021.05.002
Reference: [3] Andrei, N.: Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization.Optim. Methods Softw. 32 (2017), 534-551. Zbl 1368.49057, MR 3630455, 10.1080/10556788.2016.1225211
Reference: [4] Andrei, N.: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method.Calcolo 57 (2020), Article ID 17, 27 pages. Zbl 1445.90100, MR 4102811, 10.1007/s10092-020-00365-7
Reference: [5] Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae.J. Optim. Theory Appl. 167 (2015), 91-101. Zbl 1327.90394, MR 3395207, 10.1007/s10957-015-0724-x
Reference: [6] Byrd, R. H., Nocedal, J.: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization.SIAM J. Numer. Anal. 26 (1989), 727-739. Zbl 0676.65061, MR 0997665, 10.1137/0726042
Reference: [7] Dai, Y.-H., Kou, C.-X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search.SIAM J. Optim. 23 (2013), 296-320. Zbl 1266.49065, MR 3033109, 10.1137/100813026
Reference: [8] Bouter, M. L. de Leeuw den, Gijzen, M. B. van, Remis, R. F.: CG variants for general-form regularization with an application to low-field MRI.Numerical Mathematics and Advanced Applications: ENUMATH 2019. Lecture Notes in Computational Science and Engineering 139. Springer, Cham (2019), 673-681. Zbl 1475.65038, MR 4266546, 10.1007/978-3-030-55874-1_66
Reference: [9] Dennis, J. E., Jr., Wolkowicz, H.: Sizing and least-change secant methods.SIAM J. Numer. Anal. 30 (1993), 1291-1314. Zbl 0802.65081, MR 1239822, 10.1137/0730067
Reference: [10] Dolan, E. D., Moré, J. J.: Benchmarking optimization software with performance profiles.Math. Program. 91 (2002), 201-213. Zbl 1049.90004, MR 1875515, 10.1007/s101070100263
Reference: [11] Esmaeili, H., Shabani, S., Kimiaei, M.: A new generalized shrinkage conjugate gradient method for sparse recovery.Calcolo 56 (2019), Article ID 1, 38 pages. Zbl 1461.65147, MR 3882971, 10.1007/s10092-018-0296-x
Reference: [12] Gould, N. I. M., Orban, D., Toint, P. L.: CUTEr and SifDec: A constrained and unconstrained testing environment, revisited.ACM Trans. Math. Softw. 29 (2003), 373-394. Zbl 1068.90526, 10.1145/962437.962439
Reference: [13] Hager, W. W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent.ACM Trans. Math. Softw. 32 (2006), 113-137. Zbl 1346.90816, MR 2272354, 10.1145/1132973.1132979
Reference: [14] Kaporin, I. E.: New convergence results and preconditioning strategies for the conjugate gradient method.Numer. Linear Algebra Appl. 1 (1994), 179-210. Zbl 0837.65027, MR 1277801, 10.1002/nla.1680010208
Reference: [15] Kratzer, D., Parter, S. V., Steuerwalt, M.: Block splittings for the conjugate gradient method.Comput. Fluids 11 (1983), 255-279. Zbl 0526.76003, MR 0726692, 10.1016/0045-7930(83)90015-4
Reference: [16] Li, W., Liu, Y., Yang, J., Wu, W.: A new conjugate gradient method with smoothing $L_{1/2}$ regularization based on a modified secant equation for training neural networks.Neural Process. Lett. 48 (2018), 955-978. 10.1007/s11063-017-9737-9
Reference: [17] Nezhadhosein, S.: A modified descent spectral conjugate gradient method for unconstrained optimization.Iran. J. Sci. Technol., Trans. A, Sci. 45 (2021), 209-220. MR 4208137, 10.1007/s40995-020-01012-0
Reference: [18] Nocedal, J., Wright, S. J.: Numerical Optimization.Springer Series in Operations Research and Financial Engineering. Springer, New York (2006). Zbl 1104.65059, MR 2244940, 10.1007/978-0-387-40065-5
Reference: [19] Perry, A.: A class of conjugate gradient algorithms with a two-step variable metric memory.Discussion paper 269 (1977), 16 pages. Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.
Reference: [20] Shanno, D. F.: Conjugate gradient methods with inexact searches.Math. Oper. Res. 3 (1978), 244-256. Zbl 0399.90077, MR 0506662, 10.1287/moor.3.3.244
Reference: [21] Sun, M., Liu, J., Wang, Y.: Two improved conjugate gradient methods with application in compressive sensing and motion control.Math. Probl. Eng. 2020 (2020), Article ID 9175496, 11 pages. Zbl 07347959, MR 4099297, 10.1155/2020/9175496
Reference: [22] Watkins, D. S.: Fundamentals of Matrix Computations.Pure and Applied Mathematics. John Wiley & Sons, New York (2002). Zbl 1005.65027, MR 1899577, 10.1002/0471249718
Reference: [23] Winther, R.: Some superlinear convergence results for the conjugate gradient method.SIAM J. Numer. Anal. 17 (1980), 14-17. Zbl 0447.65021, MR 0559456, 10.1137/071700
Reference: [24] Wolfe, P.: Convergence conditions for ascent methods.SIAM Rev. 11 (1969), 226-235. Zbl 0177.20603, MR 0250453, 10.1137/1011036
Reference: [25] Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections.SIAM Rev. 13 (1971), 185-188. Zbl 0216.26901, MR 0288943, 10.1137/1013035
.

Fulltext not available (moving wall 24 months)
