
Article

Keywords:
Rényi $\alpha$-entropy; non-extensive entropy of degree $\alpha$; error probability; Bayesian problems; functional convexity
Summary:
Generalized entropic functionals are an active area of research, so lower and upper bounds on these functionals are of interest. Lower bounds are obtained for the Rényi conditional $\alpha$-entropy and for two kinds of non-extensive conditional $\alpha$-entropy. These bounds are expressed in terms of the error probability of the standard decision and extend the inequalities known for the ordinary conditional entropy. The presented inequalities rest mainly on the convexity of certain functions. In a certain sense, they are complementary to generalized inequalities of Fano type.
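The quantities appearing in the summary can be illustrated in Python. This is a minimal sketch of the standard textbook definitions of the Rényi $\alpha$-entropy, the non-extensive (Tsallis/Havrda-Charvát) entropy of degree $\alpha$, and the error probability of the standard (maximum a posteriori) decision; it does not reproduce the paper's bounds themselves.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi alpha-entropy: H_a(p) = log(sum_i p_i^a) / (1 - a), for a > 0, a != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def tsallis_entropy(p, alpha):
    """Non-extensive entropy of degree alpha: S_a(p) = (1 - sum_i p_i^a) / (a - 1)."""
    return (1.0 - sum(x ** alpha for x in p)) / (alpha - 1.0)

def bayes_error(p):
    """Error probability of the standard decision: guess the most probable
    outcome, so the probability of error is 1 - max_i p_i."""
    return 1.0 - max(p)

# Example: a uniform two-point distribution.
p = [0.5, 0.5]
print(renyi_entropy(p, 2.0))   # collision entropy, equals log 2 here
print(tsallis_entropy(p, 2.0))
print(bayes_error(p))
```

Both generalized entropies recover the Shannon entropy $-\sum_i p_i \log p_i$ in the limit $\alpha \to 1$, which is why bounds of this kind extend the classical Fano-type inequalities for the ordinary conditional entropy.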
References:
[1] L. Baladová: Minimum of average conditional entropy for given minimum probability of error. Kybernetika 2 (1966), 416-422. MR 0215641 | Zbl 0199.21502
[2] T. Cover, J. Thomas: Elements of Information Theory. John Wiley & Sons, New York 1991. MR 1122806 | Zbl 1140.94001
[3] I. Csiszár: Axiomatic characterizations of information measures. Entropy 10 (2008), 261-273. DOI 10.3390/e10030261 | Zbl 1179.94043
[4] Z. Daróczy: Generalized information functions. Inform. and Control 16 (1970), 36-51. DOI 10.1016/S0019-9958(70)80040-7 | MR 0272528 | Zbl 0205.46901
[5] M. H. DeGroot: Optimal Statistical Decisions. McGraw-Hill, New York 1970. MR 0356303 | Zbl 1136.62011
[6] D. Erdogmus, J. C. Principe: Lower and upper bounds for misclassification probability based on Rényi's information. J. VLSI Signal Process. 37 (2004), 305-317. DOI 10.1023/B:VLSI.0000027493.48841.39 | Zbl 1073.94507
[7] R. M. Fano: Transmission of Information: A Statistical Theory of Communications. MIT Press and John Wiley & Sons, New York 1961. MR 0134389 | Zbl 0151.24402
[8] M. Feder, N. Merhav: Relations between entropy and error probability. IEEE Trans. Inform. Theory 40 (1994), 259-266. DOI 10.1109/18.272494 | Zbl 0802.94004
[9] S. Furuichi: Information theoretical properties of Tsallis entropies. J. Math. Phys. 47 (2006), 023302. DOI 10.1063/1.2165744 | MR 2208160 | Zbl 1111.94008
[10] M. Gell-Mann, C. Tsallis, eds.: Nonextensive Entropy - Interdisciplinary Applications. Oxford University Press, Oxford 2004. MR 2073730 | Zbl 1127.82004
[11] G. H. Hardy, J. E. Littlewood, G. Pólya: Inequalities. Cambridge University Press, London 1934. Zbl 0634.26008
[12] J. Havrda, F. Charvát: Quantification methods of classification processes: concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30-35. MR 0209067
[13] P. Jizba, T. Arimitsu: The world according to Rényi: thermodynamics of multifractal systems. Ann. Phys. 312 (2004), 17-59. DOI 10.1016/j.aop.2004.01.002 | MR 2067083 | Zbl 1044.82001
[14] R. Kamimura: Minimizing $\alpha$-information for generalization and interpretation. Algorithmica 22 (1998), 173-197. DOI 10.1007/PL00013828 | MR 1637503 | Zbl 0910.68173
[15] A. Novikov: Optimal sequential procedures with Bayes decision rules. Kybernetika 46 (2010), 754-770. MR 2722099 | Zbl 1201.62095
[16] A. Perez: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1-21. MR 0208775 | Zbl 0153.48403
[17] A. E. Rastegin: Rényi formulation of the entropic uncertainty principle for POVMs. J. Phys. A: Math. Theor. 43 (2010), 155302. DOI 10.1088/1751-8113/43/15/155302 | MR 2608279 | Zbl 1189.81012
[18] A. E. Rastegin: Entropic uncertainty relations for extremal unravelings of super-operators. J. Phys. A: Math. Theor. 44 (2011), 095303. DOI 10.1088/1751-8113/44/9/095303 | MR 2771869 | Zbl 1211.81021
[19] A. E. Rastegin: Continuity estimates on the Tsallis relative entropy. E-print arXiv:1102.5154v2 [math-ph] (2011). MR 2841748
[20] A. E. Rastegin: Fano type quantum inequalities in terms of $q$-entropies. Quantum Information Processing (2011). DOI 10.1007/s11128-011-0347-6
[21] A. Rényi: On measures of entropy and information. In: Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1961, pp. 547-561. MR 0132570 | Zbl 0106.33001
[22] A. Rényi: On the amount of missing information in a random variable concerning an event. J. Math. Sci. 1 (1966), 30-33. MR 0210263
[23] A. Rényi: Statistics and information theory. Stud. Sci. Math. Hung. 2 (1967), 249-256. MR 0212964 | Zbl 0155.27602
[24] A. Rényi: On some basic problems of statistics from the point of view of information theory. In: Proc. 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1967, pp. 531-543. MR 0212963 | Zbl 0201.51905
[25] B. Schumacher: Sending entanglement through noisy quantum channels. Phys. Rev. A 54 (1996), 2614-2628. DOI 10.1103/PhysRevA.54.2614
[26] C. Tsallis: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52 (1988), 479-487. DOI 10.1007/BF01016429 | MR 0968597 | Zbl 1082.82501
[27] I. Vajda: On the statistical decision problem with discrete parameter space. Kybernetika 3 (1967), 110-126. MR 0215428
[28] I. Vajda: Bounds of the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredachii Informacii 4 (1968), 9-19 (in Russian); translated as Problems of Information Transmission 4 (1968), 6-14. MR 0267685
[29] K. Życzkowski: Rényi extrapolation of Shannon entropy. Open Sys. Inform. Dyn. 10 (2003), 297-310; corrigendum in the e-print version arXiv:quant-ph/0305062v2. MR 1998623 | Zbl 1030.94022