[1] Abdallah, S. A., Plumbley, M. D.: Predictive Information, Multi-Information and Binding Information. Technical Report C4DM-TR-10-10, Queen Mary University of London, 2010.
[2] Ahlswede, R.: An elementary proof of the strong converse theorem for the multiple-access channel. J. Combin. Inform. System Sci. 7 (1982), 3, 216-230. MR 0724363
[6] Ay, N., Olbrich, E., Bertschinger, N., Jost, J.: A unifying framework for complexity measures of finite systems. Working Paper 06-08-028, Santa Fe Institute, 2006.
[9] Chung, F. R. K., Graham, R. L., Frankl, P., Shearer, J. B.: Some intersection theorems for ordered sets and graphs. J. Combin. Theory Ser. A 43 (1986), 1, 23-37. DOI 10.1016/0097-3165(86)90019-1 | MR 0859293
[10] Coja-Oghlan, A., Krzakala, F., Perkins, W., Zdeborová, L.: Information-theoretic thresholds from the cavity method. Adv. Math. 333 (2018), 694-795. DOI 10.1016/j.aim.2018.05.029 | MR 3818090
[12] Cover, T. M., Thomas, J. A.: Elements of Information Theory. Second edition. Wiley-Interscience, John Wiley and Sons, Hoboken, NJ 2006. MR 2239987
[13] Crooks, G.: On Measures of Entropy and Information. Technical note.
[16] Dembo, A., Zeitouni, O.: Large Deviations Techniques and Applications. Second edition. Stochastic Modelling and Applied Probability 38, Springer-Verlag, Berlin 2010. DOI 10.1007/978-1-4612-5320-4 | MR 2571413
[17] Dobrušin, R. L.: A general formulation of Shannon's fundamental theorem in the theory of information. Dokl. Akad. Nauk SSSR 126 (1959), 474-477. MR 0107573
[18] Dougherty, R., Freiling, C., Zeger, K.: Networks, matroids, and non-Shannon information inequalities. IEEE Trans. Inform. Theory 53 (2007), 6, 1949-1969. DOI 10.1109/tit.2007.896862 | MR 2321860
[20] Dueck, G.: The strong converse of the coding theorem for the multiple-access channel. J. Combin. Inform. System Sci. 6 (1981), 3, 187-196. MR 0652388
[21] Eldan, R.: Gaussian-width gradient complexity, reverse log-Sobolev inequalities and nonlinear large deviations. Geom. Funct. Anal. 28 (2018), 6, 1548-1596. DOI 10.1007/s00039-018-0461-z | MR 3881829
[22] Eldan, R., Gross, R.: Exponential random graphs behave like mixtures of stochastic block models. Preprint. MR 3861824
[23] Eldan, R., Gross, R.: Decomposition of mean-field Gibbs distributions into product measures. Electron. J. Probab. 23 (2018), paper no. 35, 24 pp. DOI 10.1214/18-EJP159 | MR 3798245
[24] Ellis, D., Friedgut, E., Kindler, G., Yehudayoff, A.: Geometric stability via information theory. Discrete Anal. (2016), paper no. 10, 28 pp. DOI 10.19086/da.784 | MR 3555193
[27] Gelfand, I. M., Kolmogorov, A. N., Yaglom, A. M.: On the general definition of the quantity of information. Dokl. Akad. Nauk SSSR 111 (1956), 4, 745-748. MR 0084440
[32] Madiman, M., Tetali, P.: Information inequalities for joint distributions, with interpretations and applications. IEEE Trans. Inform. Theory 56 (2010), 6, 2699-2713. DOI 10.1109/tit.2010.2046253 | MR 2683430
[33] Makarychev, K., Makarychev, Y., Romashchenko, A., Vereshchagin, N.: A new class of non-Shannon-type inequalities for entropies. Commun. Inf. Syst. 2 (2002), 2, 147-165. DOI 10.4310/cis.2002.v2.n2.a3 | MR 1958013
[35] Marton, K.: Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration. Ann. Probab. 24 (1996), 2, 857-866. DOI 10.1214/aop/1039639365 | MR 1404531
[37] McDiarmid, C.: On the method of bounded differences. In: Surveys in Combinatorics, Norwich 1989, London Math. Soc. Lecture Note Ser. 141, Cambridge Univ. Press, Cambridge 1989, pp. 148-188. DOI 10.1017/cbo9781107359949.008 | MR 1036755
[39] Pearl, J., Paz, A.: Graphoids: a graph-based logic for reasoning about relevance relations. In: Advances in Artificial Intelligence - II (B. Du Boulay, D. Hogg, and L. Steels, eds.), North-Holland, Amsterdam 1987, pp. 357-363.
[40] Perez, A.: Information theory with an abstract alphabet. Generalized forms of McMillan's limit theorem for the case of discrete and continuous times. Theory Probab. Appl. 4 (1959), 99-102. DOI 10.1137/1104007 | MR 0122613
[41] Perez, A.: $\epsilon$-admissible simplifications of the dependence structure of a set of random variables. Kybernetika 13 (1977), 6, 439-449. MR 0472224
[42] Pinsker, M. S.: Information and Information Stability of Random Variables and Processes. Holden-Day, Inc., San Francisco 1964. MR 0213190
[43] Radhakrishnan, J.: Entropy and counting. In: Computational Mathematics, Modelling and Applications (IIT Kharagpur, Golden Jubilee Volume) (J. Mishra, ed.), Narosa Publishers, 2001, pp. 146-168.
[44] Schneidman, E., Still, S., Berry, M. J., Bialek, W.: Network information and connected correlations. Phys. Rev. Lett. 91 (2003), 238701. DOI 10.1103/physrevlett.91.238701
[45] Studený, M., Vejnarová, J.: The multiinformation function as a tool for measuring stochastic dependence. In: Proc. NATO Advanced Study Institute on Learning in Graphical Models, Kluwer Academic Publishers, Norwell 1998, pp. 261-297. DOI 10.1007/978-94-011-5014-9_10
[46] Timme, N., Alford, W., Flecker, B., Beggs, J. M.: Synergy, redundancy, and multivariate information measures: an experimentalist's perspective. J. Comput. Neurosci. 36 (2014), 2, 119-140. DOI 10.1007/s10827-013-0458-4 | MR 3176934
[48] Yan, J.: Nonlinear large deviations: beyond the hypercube. Preprint. MR 4108123
[49] Zhang, Z., Yeung, R. W.: On characterization of entropy function via information inequalities. IEEE Trans. Inform. Theory 44 (1998), 4, 1440-1452. DOI 10.1109/18.681320 | MR 1665794