
Article

Title: On typical encodings of multivariate ergodic sources (English)
Author: Kupsa, Michal
Language: English
Journal: Kybernetika
ISSN: 0023-5954 (print)
ISSN: 1805-949X (online)
Volume: 56
Issue: 6
Year: 2020
Pages: 1090-1110
Summary lang: English
.
Category: math
.
Summary: We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. We show that the proportion of exceptional encodings, those whose profile is not close to the convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy the asymptotic equipartition property, described via the mean fluctuation of the information functions. This class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results are based on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable with an entropy profile close to the suitable convolution. (English)
Keyword: entropy
Keyword: entropy rate
Keyword: multivariate source
Keyword: ergodic source
Keyword: a.e.p. property
MSC: 94A24
MSC: 94A29
idMR: MR4199905
DOI: 10.14736/kyb-2020-6-1090
.
Date available: 2021-01-08T08:35:48Z
Last updated: 2021-03-29
Stable URL: http://hdl.handle.net/10338.dmlcz/148501
.
Reference: [1] Bassoli, R., Marques, H., Rodriguez, J., Shum, K. W., Tafazolli, R.: Network coding theory: A survey. IEEE Commun. Surveys Tutor. 15 (2013), 1950-1978. DOI 10.1109/surv.2013.013013.00104
Reference: [2] Cover, T. M., Thomas, J. A.: Elements of Information Theory. John Wiley and Sons, 2012. MR 2239987, DOI 10.1002/0471200611
Reference: [3] Gray, R. M., Kieffer, J. C.: Asymptotically mean stationary measures. Ann. Probab. 8 (1980), 962-973. MR 0586779, DOI 10.1214/aop/1176994624
Reference: [4] Gray, R. M.: Entropy and Information Theory. Springer Science and Business Media, 2011. MR 3134681
Reference: [5] Kaced, T.: Partage de secret et théorie algorithmique de l'information. PhD. Thesis, Université Montpellier 2, 2012.
Reference: [6] Kieffer, J. C.: A generalized Shannon-McMillan theorem for the action of an amenable group on a probability space. Ann. Probab. 3 (1975), 1031-1037. MR 0393422, DOI 10.1214/aop/1176996230
Reference: [7] Matúš, F.: Two constructions on limits of entropy functions. IEEE Trans. Inform. Theory 53 (2007), 320-330. MR 2292891, DOI 10.1109/tit.2006.887090
Reference: [8] Matúš, F., Csirmaz, L.: Entropy region and convolution. IEEE Trans. Inform. Theory 62 (2016), 6007-6018. MR 3565097, DOI 10.1109/tit.2016.2601598
Reference: [9] Matúš, F., Kupsa, M.: On colorings of bivariate random sequences. In: Proc. IEEE International Symposium on Information Theory 2010, pp. 1272-1275. DOI 10.1109/isit.2010.5513700
Reference: [10] Yeung, R. W.: Information Theory and Network Coding. Springer Science and Business Media, 2008. DOI 10.1007/978-0-387-79234-7_1
.
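For intuition about the two objects named in the summary, the following Python sketch (not from the paper; an illustration under stated assumptions) computes the entropy profile of a finite multivariate distribution, h(S) = H(X_i : i in S), and its convolution with a modular polymatroid r(S) = sum over i in S of log2|A_i|, assuming the standard polymatroid convolution (f * g)(S) = min over T subset of S of f(T) + g(S \ T), as in Matúš and Csirmaz.

```python
from itertools import chain, combinations
from math import log2

def subsets(ground):
    """All subsets of the ground set, as frozensets."""
    return [frozenset(c) for c in chain.from_iterable(
        combinations(ground, r) for r in range(len(ground) + 1))]

def entropy_profile(joint, n):
    """Entropy profile h(S) = H(X_i : i in S) of a joint pmf.

    `joint` maps n-tuples of outcomes to probabilities.
    """
    profile = {}
    for S in subsets(range(n)):
        # Marginalize the joint pmf onto the coordinates in S.
        marginal = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in sorted(S))
            marginal[key] = marginal.get(key, 0.0) + p
        profile[S] = -sum(p * log2(p) for p in marginal.values() if p > 0)
    return profile

def modular_polymatroid(alphabet_sizes):
    """r(S) = sum of log2 of the output-alphabet cardinalities over S."""
    n = len(alphabet_sizes)
    return {S: sum(log2(alphabet_sizes[i]) for i in S)
            for S in subsets(range(n))}

def convolution(f, g, n):
    """(f * g)(S) = min over T subset of S of f(T) + g(S \\ T)."""
    return {S: min(f[T] + g[S - T] for T in subsets(S))
            for S in subsets(range(n))}
```

As a toy check, for two fully correlated fair bits the joint entropy is 1, while the modular polymatroid of two binary output alphabets assigns 2 to the full set; the convolution caps the profile at the smaller of the two, reflecting how encoding into finite alphabets can only reduce the entropy profile.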
