
Article

Keywords:
sparse inverse covariance selection; regularization; graphical models; entropy; optimization
Summary:
Graphical models provide an undirected graph representation of the relations between the components of a random vector. In the Gaussian case, such an undirected graph describes the conditional independence relations among these components. In this paper, we consider a continuous-time Gaussian model which is accessible to observation only at time $T$. We introduce the concept of infinitesimal conditional independence for such a model. Then, we address the corresponding graphical model selection problem, i.e., the problem of estimating the graphical model from data. Finally, simulation studies are presented to test the effectiveness of the graphical model selection procedure.
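As a purely illustrative sketch of the static Gaussian case mentioned in the summary (not the continuous-time selection procedure of this paper), the snippet below shows how zeros in the inverse covariance (concentration) matrix encode conditional independence, and how a graphical-lasso estimator in the spirit of [17] recovers that sparsity pattern from samples. The chain graph, sample size, and regularization level alpha are arbitrary assumptions chosen for the example; the estimator used is scikit-learn's GraphicalLasso.

```python
# Illustration only: in a Gaussian graphical model, components i and j are
# conditionally independent given all the others iff entry (i, j) of the
# inverse covariance (concentration) matrix is zero.  The graphical lasso
# recovers this sparsity pattern by l1-regularized maximum likelihood.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground-truth concentration matrix K for a chain graph 1 - 2 - 3 - 4:
# e.g. components 1 and 3 are conditionally independent given 2 and 4.
K = np.array([[2.0, 0.6, 0.0, 0.0],
              [0.6, 2.0, 0.6, 0.0],
              [0.0, 0.6, 2.0, 0.6],
              [0.0, 0.0, 0.6, 2.0]])
Sigma = np.linalg.inv(K)  # covariance of the Gaussian random vector

# Draw samples and estimate a sparse inverse covariance (alpha is the
# regularization strength; its value here is an illustrative choice).
X = rng.multivariate_normal(np.zeros(4), Sigma, size=2000)
model = GraphicalLasso(alpha=0.05).fit(X)

# The support of the estimated precision matrix encodes the selected graph.
K_hat = model.precision_
print(np.round(K_hat, 2))
print("estimated edges:", np.argwhere(np.triu(np.abs(K_hat) > 1e-2, k=1)))
```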
References:
[1] Alpago, D., Zorzi, M., Ferrante, A.: Identification of sparse reciprocal graphical models. IEEE Control Systems Lett. 2 (2018), 4, 659-664. DOI 10.1109/lcsys.2018.2845943
[2] Avventi, E., Lindquist, A., Wahlberg, B.: ARMA identification of graphical models. IEEE Trans. Automat. Control 58 (2013), 1167-1178. DOI 10.1109/tac.2012.2231551 | MR 3047919
[3] Baggio, G.: Further results on the convergence of the Pavon-Ferrante algorithm for spectral estimation. IEEE Trans. Automat. Control 63 (2018), 10, 3510-3515. DOI 10.1109/tac.2018.2794407 | MR 3866257
[4] Banerjee, O., Ghaoui, L. El, d'Aspremont, A.: Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. J. Machine Learning Res. 9 (2008), 485-516. MR 2417243
[5] Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge Univ. Press, Cambridge 2004. DOI 10.1017/cbo9780511804441 | MR 2061575 | Zbl 1058.90049
[6] Byrnes, C., Gusev, S., Lindquist, A.: A convex optimization approach to the rational covariance extension problem. SIAM J. Control Optim. 37 (1998), 211-229. DOI 10.1137/s0363012997321553 | MR 1642019
[7] Byrnes, C. I., Georgiou, T. T., Lindquist, A.: A new approach to spectral estimation: A tunable high-resolution spectral estimator. IEEE Trans. Signal Process. 48 (2000), 3189-3205. DOI 10.1109/78.875475 | MR 1791083
[8] Candes, E., Plan, Y.: Matrix completion with noise. Proc. IEEE 98 (2010), 925-936. DOI 10.1109/jproc.2009.2035722
[9] Candes, E., Recht, B.: Exact matrix completion via convex optimization. Comm. ACM 55 (2012), 111-119. DOI 10.1145/2184319.2184343 | MR 2565240
[10] Chandrasekaran, V., Parrilo, P., Willsky, A.: Latent variable graphical model selection via convex optimization. Ann. Statist. 40 (2012), 4, 1935-1967. DOI 10.1214/12-aos1020 | MR 3059067
[11] Chandrasekaran, V., Shah, P.: Relative entropy optimization and its applications. Math. Program. 161 (2017), 1-2, 1-32. DOI 10.1007/s10107-016-0998-2 | MR 3592772
[12] Cover, T., Thomas, J.: Elements of Information Theory. Wiley, New York 1991. DOI 10.1002/0471200611
[13] d'Aspremont, A., Banerjee, O., Ghaoui, L. El: First-order methods for sparse covariance selection. SIAM J. Matrix Analysis Appl. 30 (2008), 56-66. DOI 10.1137/060670985 | MR 2399568
[14] Dempster, A.: Covariance selection. Biometrics 28 (1972), 157-175. DOI 10.2307/2528966 | MR 3931974
[15] Ferrante, A., Pavon, M.: Matrix completion à la Dempster by the principle of parsimony. IEEE Trans. Inform. Theory 57 (2011), 3925-3931. DOI 10.1109/tit.2011.2143970 | MR 2817064
[16] Ferrante, A., Pavon, M., Ramponi, F.: Hellinger versus Kullback-Leibler multivariable spectrum approximation. IEEE Trans. Automat. Control 53 (2008), 954-967. DOI 10.1109/tac.2008.920238 | MR 2419442
[17] Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9 (2008), 432-441. DOI 10.1093/biostatistics/kxm045
[18] Grant, M., Boyd, S.: CVX: Matlab software for disciplined convex programming, version 2.1. 2014.
[19] Gu, S., Betzel, R., Mattar, M., Cieslak, M., Delio, P., Grafton, S., Pasqualetti, F., Bassett, D.: Optimal trajectories of brain state transitions. NeuroImage 148 (2017), 305-317. DOI 10.1016/j.neuroimage.2017.01.003
[20] Huang, J., Liu, N., Pourahmadi, M., Liu, L.: Covariance matrix selection and estimation via penalised normal likelihood. Biometrika 93 (2006), 85-98. DOI 10.1093/biomet/93.1.85 | MR 2277742
[21] Huotari, N., Raitamaa, L., Helakari, H., Kananen, J., Raatikainen, V., Rasila, A., Tuovinen, T., Kantola, J., Borchardt, V., Kiviniemi, V., Korhonen, V.: Sampling rate effects on resting state fMRI metrics. Frontiers Neurosci. 13 (2019), 279. DOI 10.3389/fnins.2019.00279
[22] Jalali, A., Sanghavi, S.: Learning the dependence graph of time series with latent factors. In: International Conference on Machine Learning, Edinburgh 2012.
[23] Koller, D., Friedman, N.: Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009. MR 2778120
[24] Lauritzen, S.: Graphical Models. Oxford University Press, Oxford 1996. MR 1419991
[25] Meinshausen, N., Bühlmann, P.: High-dimensional graphs and variable selection with the lasso. Ann. Statist. 34 (2006), 1436-1462. DOI 10.1214/009053606000000281 | MR 2278363
[26] Pearl, J.: Graphical models for probabilistic and causal reasoning. In: Quantified Representation of Uncertainty and Imprecision, Springer 1998, pp. 367-389. DOI 10.1007/978-94-017-1735-9_12 | MR 1743892
[27] Ringh, A., Karlsson, J., Lindquist, A.: Multidimensional rational covariance extension with approximate covariance matching. SIAM J. Control Optim. 56 (2018), 2, 913-944. DOI 10.1137/17m1127922 | MR 3775123
[28] Songsiri, J., Dahl, J., Vandenberghe, L.: Graphical models of autoregressive processes. In: Convex Optimization in Signal Processing and Communications (D. Palomar and Y. Eldar, eds.), Cambridge Univ. Press, Cambridge 2010, pp. 1-29. MR 2767565
[29] Songsiri, J., Vandenberghe, L.: Topology selection in graphical models of autoregressive processes. J. Machine Learning Res. 11 (2010), 2671-2705. MR 2738780
[30] Yue, Z., Thunberg, J., Ljung, L., Gonçalves, J.: Identification of sparse continuous-time linear systems with low sampling rate: Exploring matrix logarithms. arXiv preprint arXiv:1605.08590, 2016.
[31] Zhu, B., Baggio, G.: On the existence of a solution to a spectral estimation problem à la Byrnes-Georgiou-Lindquist. IEEE Trans. Automat. Control 64 (2019), 2, 820-825. DOI 10.1109/tac.2018.2836984 | MR 3912133
[32] Zorzi, M.: A new family of high-resolution multivariate spectral estimators. IEEE Trans. Automat. Control 59 (2014), 892-904. DOI 10.1109/tac.2013.2293218 | MR 3199341
[33] Zorzi, M.: Rational approximations of spectral densities based on the Alpha divergence. Math. Control Signals Systems 26 (2014), 259-278. DOI 10.1007/s00498-013-0118-2 | MR 3201948
[34] Zorzi, M.: An interpretation of the dual problem of the THREE-like approaches. Automatica 62 (2015), 87-92. DOI 10.1016/j.automatica.2015.09.023 | MR 3423974
[35] Zorzi, M.: Multivariate Spectral Estimation based on the concept of Optimal Prediction. IEEE Trans. Automat. Control 60 (2015), 1647-1652. DOI 10.1109/tac.2014.2359713 | MR 3353402
[36] Zorzi, M.: Empirical Bayesian learning in AR graphical models. Automatica 109 (2019), 108516. DOI 10.1016/j.automatica.2019.108516 | MR 3989933
[37] Zorzi, M., Sepulchre, R.: AR identification of latent-variable graphical models. IEEE Trans. Automat. Control 61 (2016), 2327-2340. DOI 10.1109/tac.2015.2491678 | MR 3545056