
Article

Title: Tropical probability theory and an application to the entropic cone (English)
Author: Matveev, Rostislav
Author: Portegies, Jacobus W.
Language: English
Journal: Kybernetika
ISSN: 0023-5954 (print)
ISSN: 1805-949X (online)
Volume: 56
Issue: 6
Year: 2020
Pages: 1133-1153
Summary lang: English
.
Category: math
.
Summary: In a series of articles, we have been developing a theory of tropical diagrams of probability spaces, expecting it to be useful for information optimization problems in information theory and artificial intelligence. In this article, we give a summary of our work so far and apply the theory to derive a dimension-reduction statement about the shape of the entropic cone. (English)
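Note (added for context, not quoted from the article): the "entropic cone" named in the summary and keywords admits a standard sketch; the symbols $\mathbf{h}$, $\Gamma^*_n$ and $\bar{\Gamma}^*_n$ below are the conventional notation of the information-inequality literature, assumed here rather than taken from the article.
% A minimal sketch: collect the joint entropies of all nonempty subsets
% I of n jointly distributed random variables X_1, ..., X_n into one vector,
% writing X_I = (X_i)_{i in I}.
\[
  \mathbf{h}(X_1,\dots,X_n)
    \;=\; \bigl( H(X_I) \bigr)_{\emptyset \neq I \subseteq \{1,\dots,n\}}
    \;\in\; \mathbb{R}^{2^n-1},
  \qquad
  \bar{\Gamma}^*_n
    \;=\; \operatorname{cl}\bigl\{ \mathbf{h}(X_1,\dots,X_n) :
      X_1,\dots,X_n \text{ finite-valued random variables} \bigr\}.
\]
% The closure \bar{\Gamma}^*_n is the entropic cone. The Shannon inequalities
% (nonnegativity, monotonicity, submodularity of H) cut out a polyhedral outer
% bound \Gamma_n \supseteq \bar{\Gamma}^*_n; for n >= 4 the inclusion is strict,
% which is what the "non-Shannon inequality" keyword refers to.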
Keyword: tropical probability
Keyword: entropic cone
Keyword: non-Shannon inequality
MSC: 94A17
MSC: 94A24
idMR: MR4199907
DOI: 10.14736/kyb-2020-6-1133
.
Date available: 2021-01-08T08:38:26Z
Last updated: 2021-03-29
Stable URL: http://hdl.handle.net/10338.dmlcz/148503
.
Reference: [1] Ahlswede, R., Körner, J.: On common information and related characteristics of correlated information sources. Preprint, 7th Prague Conference on Information Theory, 1974. MR 2495193
Reference: [2] Ahlswede, R., Körner, J.: On common information and related characteristics of correlated information sources. In: General Theory of Information Transfer and Combinatorics (R. Ahlswede et al., eds.), Lecture Notes in Computer Science 4123, Springer, Berlin, Heidelberg, 2006. MR 2495193
Reference: [3] Bertschinger, N., Rauh, J., Olbrich, E., Jost, J., Ay, N.: Quantifying unique information. Entropy 16 (2014), 4, 2161-2183. MR 3195286, 10.3390/e16042161
Reference: [4] Chan, T. H., Yeung, R. W.: On a relation between information inequalities and group theory. IEEE Trans. Inform. Theory 48 (2002), 7, 1992-1995. MR 1930005, 10.1109/tit.2002.1013138
Reference: [5] Dougherty, R., Freiling, Ch., Zeger, K.: Six new non-Shannon information inequalities. In: 2006 IEEE International Symposium on Information Theory, IEEE, 2006, pp. 233-236. MR 2321860, 10.1109/isit.2006.261840
Reference: [6] Dougherty, R., Freiling, Ch., Zeger, K.: Non-Shannon information inequalities in four random variables. arXiv preprint arXiv:1104.3602, 2011. MR 2321860
Reference: [7] Gromov, M.: In a search for a structure, part 1: On entropy.
Reference: [8] Kovačević, M., Stanojević, I., Šenk, V.: On the hardness of entropy minimization and related problems. In: 2012 IEEE Information Theory Workshop, IEEE, 2012, pp. 512-516. 10.3390/e22040407
Reference: [9] Leinster, T.: Basic Category Theory. Cambridge Studies in Advanced Mathematics 143, Cambridge University Press, Cambridge 2014. MR 3307165
Reference: [10] Matúš, F.: Probabilistic conditional independence structures and matroid theory: background 1. Int. J. General Systems 22 (1993), 2, 185-196. 10.1080/03081079308935205
Reference: [11] Matúš, F.: Two constructions on limits of entropy functions. IEEE Trans. Inform. Theory 53 (2006), 1, 320-330. MR 2292891, 10.1109/tit.2006.887090
Reference: [12] Matúš, F.: Infinitely many information inequalities. In: IEEE International Symposium on Information Theory, ISIT 2007, IEEE, pp. 41-44. 10.1109/isit.2007.4557201
Reference: [13] Matúš, F., Csirmaz, L.: Entropy region and convolution. IEEE Trans. Inform. Theory 62 (2016), 11, 6007-6018. MR 3565097, 10.1109/tit.2016.2601598
Reference: [14] Matúš, F., Studený, M.: Conditional independences among four random variables I. Combinat. Probab. Comput. 4 (1995), 3, 269-278. MR 1356579, 10.1017/s0963548300001644
Reference: [15] Makarychev, K., Makarychev, Y., Romashchenko, A., Vereshchagin, N.: A new class of non-Shannon-type inequalities for entropies. Comm. Inform. Syst. 2 (2002), 2, 147-166. MR 1958013, 10.4310/cis.2002.v2.n2.a3
Reference: [16] Matveev, R., Portegies, J. W.: Asymptotic dependency structure of multiple signals. Inform. Geometry 1 (2018), 2, 237-285. MR 4010749, 10.1007/s41884-018-0013-5
Reference: [17] Matveev, R., Portegies, J. W.: Arrow contraction and expansion in tropical diagrams. arXiv preprint arXiv:1905.05597, 2019.
Reference: [18] Matveev, R., Portegies, J. W.: Conditioning in tropical probability theory. arXiv preprint arXiv:1905.05596, 2019.
Reference: [19] Matveev, R., Portegies, J. W.: Tropical diagrams of probability spaces. arXiv preprint arXiv:1905.04375, 2019. MR 4117580
Reference: [20] Slepian, D., Wolf, J.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory 19 (1973), 4, 471-480. MR 0421858, 10.1109/tit.1973.1055037
Reference: [21] Vidyasagar, M.: A metric between probability distributions on finite sets of different cardinalities and applications to order reduction. IEEE Trans. Automat. Control 57 (2012), 10, 2464-2477. MR 2991650, 10.1109/tac.2012.2188423
Reference: [22] Wyner, A.: The common information of two dependent random variables. IEEE Trans. Inform. Theory 21 (1975), 2, 163-179. MR 0363679, 10.1109/tit.1975.1055346
Reference: [23] Yeung, R. W.: Information Theory and Network Coding. Springer Science and Business Media, 2008. 10.1007/978-0-387-79234-7_1
Reference: [24] Zhang, Z., Yeung, R. W.: A non-Shannon-type conditional inequality of information quantities. IEEE Trans. Inform. Theory 43 (1997), 6, 1982-1986. MR 1481054, 10.1109/18.641561
Reference: [25] Zhang, Z., Yeung, R. W.: On characterization of entropy function via information inequalities. IEEE Trans. Inform. Theory 44 (1998), 4, 1440-1452. MR 1665794, 10.1109/18.681320
.

Files

Kybernetika_56-2020-6_8.pdf (662.6 KB, application/pdf)