
Article

Keywords:
iterative hard thresholding; signal reconstruction; classification problem; least squares support vector machine
Summary:
We provide a theoretical study of the iterative hard thresholding with partially known support set (IHT-PKS) algorithm when used to solve the compressed sensing recovery problem. Recent work has shown that IHT-PKS outperforms traditional IHT in reconstructing sparse or compressible signals. However, less work has been done on analyzing its performance guarantees. In this paper, we improve the current RIP-based bound for the IHT-PKS algorithm from $\delta _{3s-2k}<\frac {1}{\sqrt {32}}\approx 0.1768$ to $\delta _{3s-2k}<\frac {\sqrt {5}-1}{4}\approx 0.309$, where $\delta _{3s-2k}$ is the restricted isometry constant of the measurement matrix. We also present conditions for stable reconstruction using the ${\rm IHT}^{\mu }$-PKS algorithm, a generalized form of IHT-PKS. We further apply the algorithm to Least Squares Support Vector Machines (LS-SVM), one of the most popular tools for regression and classification learning, which however suffers from a loss of sparsity. After formulating a sparse representation of LS-SVM via compressed sensing, we exploit the support of the bias term in the LS-SVM model with the ${\rm IHT}^{\mu }$-PKS algorithm. Experimental results on classification problems show that ${\rm IHT}^{\mu }$-PKS outperforms other approaches to computing the sparse LS-SVM classifier.
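The IHT-PKS iteration summarized above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes a gradient step with step size $\mu$ followed by hard thresholding that always retains the coefficients on the known support set $T_0$ (of size $k$) and the $s-k$ largest remaining coefficients in magnitude.

```python
import numpy as np

def iht_pks(A, y, s, known_support, n_iters=100, mu=1.0):
    """Sketch of iterative hard thresholding with partially known support.

    A: (m, n) measurement matrix; y: (m,) observations; s: target sparsity;
    known_support: indices assumed to be in the true support (k = len of it).
    Each iteration takes a gradient step on 0.5*||y - A x||^2, keeps every
    coefficient on the known support, and hard-thresholds the rest to the
    s - k largest magnitudes.
    """
    m, n = A.shape
    k = len(known_support)
    known = np.zeros(n, dtype=bool)
    known[list(known_support)] = True
    x = np.zeros(n)
    for _ in range(n_iters):
        # gradient (Landweber) step with step size mu
        g = x + mu * A.T @ (y - A @ x)
        # hard thresholding restricted to the complement of the known support
        x = np.zeros(n)
        x[known] = g[known]
        if s > k:  # keep the s - k largest off-support coefficients
            rest = np.where(~known)[0]
            keep = rest[np.argsort(np.abs(g[rest]))[-(s - k):]]
            x[keep] = g[keep]
    return x
```

With $\mu = 1$ and $T_0 = \emptyset$ this reduces to the classical IHT iteration of Blumensath and Davies [5]; convergence of the plain step requires a suitably conditioned (e.g. RIP-satisfying) measurement matrix or a normalized step size.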
References:
[1] Axiotis, K., Sviridenko, M.: Sparse convex optimization via adaptively regularized hard thresholding. J. Mach. Learn. Res. 22 (2021), Article ID 122, 47 pages. MR 4279773 | Zbl 07370639
[2] Baraniuk, R. G., Cevher, V., Duarte, M. F., Hegde, C.: Model-based compressive sensing. IEEE Trans. Inf. Theory 56 (2010), 1982-2001. DOI 10.1109/TIT.2010.2040894 | MR 2654489 | Zbl 1366.94215
[3] Besson, A., Perdios, D., Wiaux, Y., Thiran, J.-P.: Joint sparsity with partially known support and application to ultrasound imaging. IEEE Signal Process. Lett. 26 (2019), 84-88. DOI 10.1109/LSP.2018.2880571
[4] Blumensath, T., Davies, M. E.: Iterative thresholding for sparse approximations. J. Fourier Anal. Appl. 14 (2008), 629-654. DOI 10.1007/s00041-008-9035-z | MR 2461601 | Zbl 1175.94060
[5] Blumensath, T., Davies, M. E.: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27 (2009), 265-274. DOI 10.1016/j.acha.2009.04.002 | MR 2559726 | Zbl 1174.94008
[6] Blumensath, T., Davies, M. E.: Normalized iterative hard thresholding: Guaranteed stability and performance. IEEE J. Selected Topics Signal Process. 4 (2010), 298-309. DOI 10.1109/JSTSP.2010.2042411
[7] Candès, E. J., Tao, T.: Decoding by linear programming. IEEE Trans. Inf. Theory 51 (2005), 4203-4215. DOI 10.1109/TIT.2005.858979 | MR 2243152 | Zbl 1264.94121
[8] Carrillo, R. E., Polania, L. F., Barner, K. E.: Iterative algorithms for compressed sensing with partially known support. IEEE International Conference on Acoustics, Speech and Signal Processing IEEE, Los Alamitos (2010), 3654-3657. DOI 10.1109/ICASSP.2010.5495901
[9] Carrillo, R. E., Polania, L. F., Barner, K. E.: Iterative hard thresholding for compressed sensing with partially known support. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) IEEE, Los Alamitos (2011), 4028-4031. DOI 10.1109/ICASSP.2011.5947236
[10] Chen, S. S., Donoho, D. L., Saunders, M. A.: Atomic decomposition by basis pursuit. SIAM Rev. 43 (2001), 129-159. DOI 10.1137/S003614450037906X | MR 1854649 | Zbl 0979.94010
[11] Cui, K., Song, Z., Han, N.: Fast thresholding algorithms with feedbacks and partially known support for compressed sensing. Asia-Pac. J. Oper. Res. 37 (2020), Article ID 2050013, 20 pages. DOI 10.1142/S021759592050013X | MR 4107957 | Zbl 1457.94037
[12] Foucart, S.: A note on guaranteed sparse recovery via $\ell_1$-minimization. Appl. Comput. Harmon. Anal. 29 (2010), 97-103. DOI 10.1016/j.acha.2009.10.004 | MR 2647014 | Zbl 1198.41011
[13] Foucart, S.: Hard thresholding pursuit: An algorithm for compressive sensing. SIAM J. Numer. Anal. 49 (2011), 2543-2563. DOI 10.1137/100806278 | MR 2873246 | Zbl 1242.65060
[14] Foucart, S., Rauhut, H.: A Mathematical Introduction to Compressive Sensing. Applied and Numerical Harmonic Analysis. Birkhäuser, New York (2013). DOI 10.1007/978-0-8176-4948-7 | MR 3100033 | Zbl 1315.94002
[15] Jacques, L.: A short note on compressed sensing with partially known signal support. Signal Process. 90 (2010), 3308-3312. DOI 10.1016/j.sigpro.2010.05.025 | Zbl 1197.94063
[16] Khajehnejad, M. A., Xu, W., Avestimehr, A. S., Hassibi, B.: Weighted $\ell_1$ minimization for sparse recovery with prior information. IEEE International Symposium on Information Theory IEEE, Los Alamitos (2009), 483-487. DOI 10.1109/ISIT.2009.5205716 | MR 2816477
[17] Khera, S., Singh, S.: Estimation of channel for millimeter-wave hybrid massive MIMO systems using Orthogonal Matching Pursuit (OMP). J. Phys., Conf. Ser. 2327 (2022), Article ID 012040, 9 pages. DOI 10.1088/1742-6596/2327/1/012040
[18] Li, Y., Lin, C., Zhang, W.: Improved sparse least-squares support vector machine classifiers. Neurocomputing 69 (2006), 1655-1658. DOI 10.1016/j.neucom.2006.03.001
[19] Liu, H., Barber, R. F.: Between hard and soft thresholding: Optimal iterative thresholding algorithms. Inf. Inference 9 (2020), 899-933. DOI 10.1093/imaiai/iaz027 | MR 4188230 | Zbl 1473.90162
[20] Mall, R., Suykens, J. A. K.: Very sparse LSSVM reductions for large-scale data. IEEE Trans. Neural Netw. Learn. Syst. 26 (2015), 1086-1097. DOI 10.1109/TNNLS.2014.2333879 | MR 3454265
[21] Shao, Y.-H., Li, C.-N., Liu, M.-Z., Wang, Z., Deng, N.-Y.: Sparse $L_q$-norm least squares support vector machine with feature selection. Pattern Recognition 78 (2018), 167-181. DOI 10.1016/j.patcog.2018.01.016 | MR 3621691
[22] Shen, J., Li, P.: A tight bound of hard thresholding. J. Mach. Learn. Res. 18 (2018), Article ID 208, 42 pages. MR 3827096 | Zbl 1473.62287
[23] Suykens, J. A. K., Lukas, L., Vandewalle, J.: Sparse approximation using least squares support vector machines. IEEE International Symposium on Circuits and Systems (ISCAS). Vol. 2 IEEE, Los Alamitos (2000), 757-760. DOI 10.1109/ISCAS.2000.856439
[24] Vaswani, N., Lu, W.: Modified-CS: Modifying compressive sensing for problems with partially known support. IEEE Trans. Signal Process. 58 (2010), 4595-4607. DOI 10.1109/TSP.2010.2051150 | MR 2752724 | Zbl 1392.94045
[25] Borries, R. von, Miosso, C. J., Potes, C.: Compressed sensing using prior information. International Workshop on Computational Advances in Multi-Sensor Adaptive Processing IEEE, Los Alamitos (2007), 121-124. DOI 10.1109/CAMSAP.2007.4497980
[26] Wang, X.-Z., Xing, H.-J., Li, Y., Hua, Q., Dong, C.-R., Pedrycz, W.: A study on relationship between generalization abilities and fuzziness of base classifiers in ensemble learning. IEEE Trans. Fuzzy Syst. 23 (2015), 1638-1654. DOI 10.1109/TFUZZ.2014.2371479
[27] Yang, J., Bouzerdoum, A., Phung, S. L.: A training algorithm for sparse LS-SVM using compressive sampling. IEEE International Conference on Acoustics, Speech and Signal Processing IEEE, Los Alamitos (2010), 2054-2057. DOI 10.1109/ICASSP.2010.5495015 | MR 0642901
[28] Yang, J., Ma, J.: A sparsity-based training algorithm for Least Squares SVM. IEEE Symposium on Computational Intelligence and Data Mining (CIDM) IEEE, Los Alamitos (2014), 345-350. DOI 10.1109/CIDM.2014.7008688
[29] Yu, M., Gupta, V., Kolar, M.: Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach. Electron. J. Stat. 14 (2020), 413-457. DOI 10.1214/19-EJS1658 | MR 4054252 | Zbl 1434.90161
[30] Zhang, X., Xu, W., Cui, Y., Lu, L., Lin, J.: On recovery of block sparse signals via block compressive sampling matching pursuit. IEEE Access 7 (2019), 175554-175563. DOI 10.1109/ACCESS.2019.2955759
[31] Zhao, Y.-B., Luo, Z.-Q.: Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms. Available at https://arxiv.org/abs/2007.01451 (2020), 10 pages. MR 4577493