
Article

Title: Multi-variate correlation and mixtures of product measures (English)
Author: Austin, Tim
Language: English
Journal: Kybernetika
ISSN: 0023-5954 (print)
ISSN: 1805-949X (online)
Volume: 56
Issue: 3
Year: 2020
Pages: 459-499
Summary lang: English
.
Category: math
.
Summary: Total correlation ("TC") and dual total correlation ("DTC") are two classical ways to quantify the correlation within an $n$-tuple of random variables. Both reduce to mutual information when $n=2$. The first part of this paper develops the theory of TC and DTC for general random variables, not necessarily finite-valued; this level of generality has not previously been set out in the literature. The second part considers the structural implications when a joint distribution $\mu$ has small TC or DTC. If $\mathrm{TC}(\mu) = o(n)$, then $\mu$ is close to a product measure in a suitable transportation metric: this follows directly from Marton's classical transportation-entropy inequality. If $\mathrm{DTC}(\mu) = o(n)$, the structural consequence is more subtle: $\mu$ is a mixture of a controlled number of terms, most of which are close to product measures in the transportation metric. This is the main new result of the paper. (English)
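The summary's two quantities can be checked on a toy example. The sketch below (illustrative only, not from the paper; the names `joint`, `H`, and `marginal` are ours) uses the standard formulas $\mathrm{TC} = \sum_i H(X_i) - H(X_1,\dots,X_n)$ and $\mathrm{DTC} = H(X_1,\dots,X_n) - \sum_i H(X_i \mid X_{\text{rest}})$ for a pair of correlated bits, where both collapse to the mutual information $I(X;Y)$:

```python
import math

def H(p):
    """Shannon entropy (in nats) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Joint pmf of (X, Y) on {0,1}^2: a pair of correlated bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def marginal(joint, idx):
    """Marginal pmf of coordinate idx (0 or 1)."""
    m = {}
    for xy, p in joint.items():
        m[xy[idx]] = m.get(xy[idx], 0.0) + p
    return list(m.values())

H_joint = H(list(joint.values()))
H_X = H(marginal(joint, 0))
H_Y = H(marginal(joint, 1))

# TC: sum of marginal entropies minus the joint entropy.
TC = H_X + H_Y - H_joint

# DTC: joint entropy minus the sum of conditional entropies
# H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X).
DTC = H_joint - (H_joint - H_Y) - (H_joint - H_X)

# For n = 2 both quantities equal the mutual information I(X;Y).
assert abs(TC - DTC) < 1e-12
```

For $n > 2$ the two quantities diverge: TC sums pairwise-style redundancies while DTC measures how much of the joint entropy is "shared", which is what drives the different structural conclusions in the paper.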
Keyword: total correlation
Keyword: dual total correlation
Keyword: transportation inequalities
Keyword: mixtures of products
MSC: 60B99
MSC: 60G99
MSC: 62B10
MSC: 94A17
idZBL: Zbl 07250733
idMR: MR4131739
DOI: 10.14736/kyb-2020-3-0459
.
Date available: 2020-09-02T09:21:15Z
Last updated: 2021-02-23
Stable URL: http://hdl.handle.net/10338.dmlcz/148310
.
Reference: [1] Abdallah, S. A., Plumbley, M. D.: Predictive Information, Multi-Information and Binding Information..Technical Report C4DM-TR-10-10, Queen Mary University of London, 2010.
Reference: [2] Ahlswede, R.: An elementary proof of the strong converse theorem for the multiple-access channel..J. Combin. Inform. System Sci. 7 (1982), 3, 216-230. MR 0724363
Reference: [3] Ahlswede, R.: The rate-distortion region for multiple descriptions without excess rate..IEEE Trans. Inform. Theory 31 (1985), 6, 721-726. MR 0823593, 10.1109/tit.1985.1057102
Reference: [4] Austin, T.: The structure of low-complexity Gibbs measures on product spaces..Ann. Probab. 47 (2019), 6, 4002-4023. MR 4038047, 10.1214/19-aop1352
Reference: [5] Austin, T.: Measure concentration and the weak Pinsker property..Publ. Math. Inst. Hautes Etudes Sci. 128 (2018), 1-119. MR 3905465, 10.1007/s10240-018-0098-3
Reference: [6] Ay, N., Olbrich, E., Bertschinger, N., Jost, J.: A unifying framework for complexity measures of finite systems..Working Paper 06-08-028, Santa Fe Institute, 2006.
Reference: [7] Balister, P., Bollobás, B.: Projections, entropy and sumsets..Combinatorica 32 (2012), 2, 125-141. MR 2927635, 10.1007/s00493-012-2453-1
Reference: [8] Chatterjee, S., Dembo, A.: Nonlinear large deviations..Adv. Math. 299 (2016), 396-450. MR 3519474, 10.1016/j.aim.2016.05.017
Reference: [9] Chung, F. R. K., Graham, R. L., Frankl, P., Shearer, J. B.: Some intersection theorems for ordered sets and graphs..J. Combin. Theory Ser. A 43 (1986), 1, 23-37. MR 0859293, 10.1016/0097-3165(86)90019-1
Reference: [10] Coja-Oghlan, A., Krzakala, F., Perkins, W., Zdeborová, L.: Information-theoretic thresholds from the cavity method..Adv. Math. 333 (2018), 694-795. MR 3818090, 10.1016/j.aim.2018.05.029
Reference: [11] Coja-Oghlan, A., Perkins, W.: Bethe states of random factor graphs..Comm. Math. Phys. 366 (2019), 1, 173-201. MR 3919446, 10.1007/s00220-019-03387-7
Reference: [12] Cover, T. M., Thomas, J. A.: Elements of Information Theory. Second edition..Wiley-Interscience, John Wiley and Sons, Hoboken, NJ 2006. MR 2239987
Reference: [13] Crooks, G.: On Measures of Entropy and Information..Technical note.
Reference: [14] Csiszár, I.: Sanov property, generalized $I$-projection and a conditional limit theorem..Ann. Probab. 12 (1984), 3, 768-793. MR 0744233, 10.1214/aop/1176993227
Reference: [15] Csiszár, I., Narayan, P.: Secrecy capacities for multiple terminals..IEEE Trans. Inform. Theory 50 (2004), 12, 3047-3061. MR 2103483, 10.1109/tit.2004.838380
Reference: [16] Dembo, A., Zeitouni, O.: Large deviations techniques and applications. Second edition..Springer-Verlag, Stochastic Modelling and Applied Probability 38, Berlin 2010. MR 2571413, 10.1007/978-1-4612-5320-4
Reference: [17] Dobrušin, R. L.: A general formulation of Shannon's fundamental theorem in the theory of information..Dokl. Akad. Nauk SSSR 126 (1959), 474-477. MR 0107573
Reference: [18] Dougherty, R., Freiling, C., Zeger, K.: Networks, matroids, and non-Shannon information inequalities..IEEE Trans. Inform. Theory 53 (2007), 6, 1949-1969. MR 2321860, 10.1109/tit.2007.896862
Reference: [19] Dudley, R. M.: Real Analysis and Probability..Cambridge University Press, Cambridge Studies in Advanced Mathematics 74, Cambridge 2002. Zbl 1023.60001, MR 1932358, 10.1017/cbo9780511755347
Reference: [20] Dueck, G.: The strong converse of the coding theorem for the multiple-access channel..J. Combin. Inform. System Sci. 6 (1981), 3, 187-196. MR 0652388
Reference: [21] Eldan, R.: Gaussian-width gradient complexity, reverse log-sobolev inequalities and nonlinear large deviations..Geometr. Funct. Anal. 28 (2018), 6, 1548-1596. MR 3881829, 10.1007/s00039-018-0461-z
Reference: [22] Eldan, R., Gross, R.: Exponential random graphs behave like mixtures of stochastic block models..Preprint. MR 3861824
Reference: [23] Eldan, R., Gross, R.: Decomposition of mean-field Gibbs distributions into product measures..Electron. J. Probab. 23 (2018), Paper No. 35, 24 pp. MR 3798245, 10.1214/18-EJP159
Reference: [24] Ellis, D., Friedgut, E., Kindler, G., Yehudayoff, A.: Geometric stability via information theory..Discrete Anal. 10 (2016), 28 pp. MR 3555193, 10.19086/da.784
Reference: [25] Fritz, T., Chaves, R.: Entropic inequalities and marginal problems..IEEE Trans. Inform. Theory 59 (2013), 2, 803-817. MR 3015697, 10.1109/tit.2012.2222863
Reference: [26] Fujishige, S.: Polymatroidal dependence structure of a set of random variables..Inform. Control 39 (1978), 1, 55-72. MR 0514262, 10.1016/s0019-9958(78)91063-x
Reference: [27] Gelfand, I., Kolmogorov, A., Yaglom, I.: On the general definition of the quantity of information..Doklady Akad. Nauk SSSR 111 (1956), 4, 745-748. MR 0084440
Reference: [28] Han, T. S.: Linear dependence structure of the entropy space..Inform. Control 29 (1975), 4, 337-368. MR 0453264, 10.1016/s0019-9958(75)80004-0
Reference: [29] Han, T. S.: Nonnegative entropy measures of multivariate symmetric correlations..Inform. Control 36 (1978), 2, 133-156. MR 0464499, 10.1016/s0019-9958(78)90275-9
Reference: [30] Kolmogorov, A. N.: On the Shannon theory of information transmission in the case of continuous signals..IEEE Trans. Inform. Theory IT-2 (1956), 102-108. MR 0097987
Reference: [31] Ledoux, M.: On Talagrand's deviation inequalities for product measures..ESAIM Probab. Statist. 1 (1995/97), 63-87. MR 1399224, 10.1051/ps:1997103
Reference: [32] Madiman, M., Tetali, P.: Information inequalities for joint distributions, with interpretations and applications..IEEE Trans. Inform. Theory 56 (2010), 6, 2699-2713. MR 2683430, 10.1109/tit.2010.2046253
Reference: [33] Makarychev, K., Makarychev, Y., Romashchenko, A., Vereshchagin, N.: A new class of non-Shannon-type inequalities for entropies..Commun. Inf. Syst. 2 (2002), 2, 147-165. MR 1958013, 10.4310/cis.2002.v2.n2.a3
Reference: [34] Marton, K.: A simple proof of the blowing-up lemma..IEEE Trans. Inform. Theory 32 (1986), 3, 445-446. MR 0838213, 10.1109/tit.1986.1057176
Reference: [35] Marton, K.: Bounding $\overline d$-distance by informational divergence: a method to prove measure concentration..Ann. Probab. 24 (1996), 2, 857-866. MR 1404531, 10.1214/aop/1039639365
Reference: [36] Matúš, F.: Two constructions on limits of entropy functions..IEEE Trans. Inform. Theory 53 (2007), 1, 320-330. MR 2292891, 10.1109/tit.2006.887090
Reference: [37] McDiarmid, C.: On the method of bounded differences..In: Surveys in Combinatorics, Norwich 1989, London Math. Soc. Lecture Note Ser. 141, Cambridge Univ. Press, Cambridge 1989, pp. 148-188. MR 1036755, 10.1017/cbo9781107359949.008
Reference: [38] McGill, W. J.: Multivariate information transmission..Trans. I.R.E. PGIT-4 (1954), 93-111. MR 0088155, 10.1109/tit.1954.1057469
Reference: [39] Pearl, J., Paz, A.: Graphoids: a graph-based logic for reasoning about relevance relations..In: Advances in Artificial Intelligence - II (B. Du Boulay, D. Hogg, and L. Steels, eds.), North Holland, Amsterdam 1987, pp. 357-363.
Reference: [40] Perez, A.: Information theory with an abstract alphabet. Generalized forms of McMillan's limit theorem for the case of discrete and continuous times..Theory Probab. Appl. 4 (1959), 99-102. MR 0122613, 10.1137/1104007
Reference: [41] Perez, A.: $\epsilon $-admissible simplifications of the dependence structure of a set of random variables..Kybernetika 13 (1977), 6, 439-449. MR 0472224
Reference: [42] Pinsker, M. S.: Information and information Stability of Random Variables and Processes..Holden-Day, Inc., San Francisco 1964. MR 0213190
Reference: [43] Radhakrishnan, J.: Entropy and counting..In: Computational Mathematics, Modelling and Applications (IIT Kharagpur, Golden Jubilee Volume) (J. Mishra, ed.), Narosa Publishers, 2001, pp. 146-168.
Reference: [44] Schneidman, E., Still, S., Berry, M. J., Bialek, W.: Network information and connected correlations..Phys. Rev. Lett. 91 (2003), 238701. 10.1103/physrevlett.91.238701
Reference: [45] Studený, M., Vejnarová, J.: The multiinformation function as a tool for measuring stochastic dependence..In: Proc. NATO Advanced Study Institute on Learning in Graphical Models, Kluwer Academic Publishers, Norwell 1998, pp. 261-297. 10.1007/978-94-011-5014-9_10
Reference: [46] Timme, N., Alford, W., Flecker, B., Beggs, J. M.: Synergy, redundancy, and multivariate information measures: An experimentalist's perspective..J. Comput. Neurosci. 36 (2014), 2, 119-140. MR 3176934, 10.1007/s10827-013-0458-4
Reference: [47] Watanabe, S.: Information theoretical analysis of multivariate correlation..IBM J. Res. Develop. 4 (1960), 66-82. MR 0109755, 10.1147/rd.41.0066
Reference: [48] Yan, J.: Nonlinear large deviations: beyond the hypercube..Preprint. MR 4108123
Reference: [49] Zhang, Z., Yeung, R. W.: On characterization of entropy function via information inequalities..IEEE Trans. Inform. Theory 44 (1998), 4, 1440-1452. MR 1665794, 10.1109/18.681320
.

Files

Kybernetika_56-2020-3_4.pdf (648.5 KB, application/pdf)