
Article

Keywords:
universal codes; typical sampling sets; entropy estimation; asymptotic equipartition property; ergodic theory
Summary:
We lift important results about universally typical sets, typically sampled sets, and empirical entropy estimation in the theory of sampling of discrete ergodic information sources from the usual one-dimensional discrete-time setting to a multidimensional lattice setting. Using packing and covering techniques with multidimensional windows, we construct sequences of multidimensional array sets which, with probability one, eventually contain the samples generated by any ergodic source of entropy rate below a given threshold $h_{0}$, and whose cardinality grows at an exponential rate of at most $h_{0}$.
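As a hedged formal sketch of this property (the finite alphabet $A$, the cubic windows $\Lambda_{n}=\{0,\dots,n-1\}^{d}$, the array sets $\mathcal{T}_{n}\subseteq A^{\Lambda_{n}}$ and the entropy rate $h(\mu)$ are illustrative notation, not fixed by this record): a sequence $(\mathcal{T}_{n})_{n}$ is universally typical at level $h_{0}$ if

$$\limsup_{n\to\infty}\frac{1}{|\Lambda_{n}|}\log_{2}\bigl|\mathcal{T}_{n}\bigr|\le h_{0}$$

and, for every ergodic shift-invariant source $\mu$ on $A^{\mathbb{Z}^{d}}$ with entropy rate $h(\mu)<h_{0}$,

$$\mu\Bigl(\bigl\{x\in A^{\mathbb{Z}^{d}}\colon x_{\Lambda_{n}}\in\mathcal{T}_{n}\ \text{for all sufficiently large }n\bigr\}\Bigr)=1,$$

where $x_{\Lambda_{n}}$ denotes the restriction of the sample $x$ to the window $\Lambda_{n}$.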