

Keywords: $f$-divergence; divergence measures in information theory; Jensen’s inequality; Hellinger and triangular discrimination
Abstract: In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete random variables under likelihood ratio constraints, in terms of the Kullback-Leibler distance. Some particular cases, for the Hellinger and triangular discrimination, the $\chi ^2$-distance, Rényi’s divergences, etc., are also considered.
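For the reader’s convenience, the standard definition of the Csiszár $f$-divergence and the conventional generating functions of the particular distances named in the abstract are recalled below; the notation here is the usual one in the literature and may differ in detail from that of the paper.

```latex
% Csiszár $f$-divergence of two discrete probability distributions
% p = (p_1,\dots,p_n) and q = (q_1,\dots,q_n), where
% f : (0,\infty) \to \mathbb{R} is convex with f(1) = 0:
I_f(p,q) \;=\; \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right)

% Generating functions of the particular cases mentioned above:
%   Kullback--Leibler distance:   f(t) = t \log t
%   \chi^2-distance:              f(t) = (t-1)^2
%   Hellinger discrimination:     f(t) = \tfrac{1}{2}\bigl(\sqrt{t}-1\bigr)^2
%   Triangular discrimination:    f(t) = \frac{(t-1)^2}{t+1}
```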
[1] I. Csiszár: Information measures: A critical survey. Trans. 7th Prague Conf. on Info. Th., Statist. Decis. Funct., Random Processes and 8th European Meeting of Statist., Volume B, Academia, Prague, 1978, pp. 73–86. MR 0519465
[2] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345
[3] I. Csiszár, J. Körner: Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, New York, 1981. MR 0666545
[4] Maximum Entropy and Bayesian Methods in Applied Statistics. J. H.  Justice (ed.), Cambridge University Press, Cambridge, 1986. MR 0892126 | Zbl 0597.00025
[5] J. N.  Kapur: On the roles of maximum entropy and minimum discrimination information principles in Statistics. Technical Address of the 38th Annual Conference of the Indian Society of Agricultural Statistics, 1984, pp. 1–44. MR 0786009
[6] I.  Burbea, C. R.  Rao: On the convexity of some divergence measures based on entropy functions. IEEE Transactions on Information Theory 28 (1982), 489–495. DOI 10.1109/TIT.1982.1056497 | MR 0672884
[7] R. G. Gallager: Information Theory and Reliable Communication. J. Wiley, New York, 1968.
[8] C. E. Shannon: A mathematical theory of communication. Bell Syst. Tech. J. 27 (1948), 379–423 and 623–656. MR 0026286 | Zbl 1154.94303
[9] B. R.  Frieden: Image enhancement and restoration. Picture Processing and Digital Filtering, T. S.  Huang (ed.), Springer-Verlag, Berlin, 1975.
[10] R. M.  Leahy, C. E.  Goutis: An optimal technique for constraint-based image restoration and mensuration. IEEE Trans. on Acoustics, Speech and Signal Processing 34 (1986), 1629–1642. DOI 10.1109/TASSP.1986.1165001
[11] S. Kullback: Information Theory and Statistics. J.  Wiley, New York, 1959. MR 0103557 | Zbl 0088.10406
[12] S.  Kullback, R. A.  Leibler: On information and sufficiency. Ann. Math. Statistics 22 (1951), 79–86. DOI 10.1214/aoms/1177729694 | MR 0039968
[13] R.  Beran: Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5 (1977), 445–463. DOI 10.1214/aos/1176343842 | MR 0448700 | Zbl 0381.62028
[14] A. Rényi: On measures of entropy and information. Proc. Fourth Berkeley Symp. Math. Statist. Prob., Vol. 1, University of California Press, Berkeley, 1961. MR 0132570 | Zbl 0106.33001
[15] S. S. Dragomir, N. M. Ionescu: Some converse of Jensen’s inequality and applications. Anal. Numer. Theor. Approx. 23 (1994), 71–78. MR 1325895
[16] S. S.  Dragomir, C. J.  Goh: A counterpart of Jensen’s discrete inequality for differentiable convex mappings and applications in information theory. Math. Comput. Modelling 24 (1996), 1–11. DOI 10.1016/0895-7177(96)00085-4 | MR 1403525
[17] S. S. Dragomir, C. J. Goh: Some counterpart inequalities for a functional associated with Jensen’s inequality. J. Inequal. Appl. 1 (1997), 311–325. MR 1732628
[18] S. S.  Dragomir, C. J.  Goh: Some bounds on entropy measures in information theory. Appl. Math. Lett. 10 (1997), 23–28. DOI 10.1016/S0893-9659(97)00028-1 | MR 1457634
[19] S. S. Dragomir, C. J. Goh: A counterpart of Jensen’s continuous inequality and applications in information theory. RGMIA Preprint. MR 1977385
[20] M.  Matić: Jensen’s inequality and applications in information theory. Ph.D. Thesis, Univ. of Zagreb, 1999. (Croatian)
[21] S. S. Dragomir, J. Šunde, M. Scholz: Some upper bounds for relative entropy and applications. Comput. Math. Appl. 39 (2000), 257–266. DOI 10.1016/S0898-1221(00)00089-4 | MR 1753564
[22] J. N.  Kapur: A comparative assessment of various measures of directed divergence. Advances Manag. Stud. 3 (1984), 1–16.
[23] D. S. Mitrinović, J. E. Pečarić, A. M. Fink: Classical and New Inequalities in Analysis. Kluwer Academic Publishers, 1993. MR 1220224
[24] F. Topsøe: Some inequalities for information divergence and related measures of discrimination. Res. Rep. Coll. RGMIA 2 (1999), 85–98. MR 1768575
[25] D.  Dacunha-Castelle: Vitesse de convergence pour certains problemes statistiques. In: Ecole d’été de Probabilités de Saint-Flour, VII (Saint-Flour, 1977). Lecture Notes in Math. Vol. 678, Springer, Berlin, 1978, pp. 1–172. MR 0518733 | Zbl 0387.62015
[26] H.  Jeffreys: An invariant form for the prior probability in estimation problems. Proc. Roy. Soc. London Ser.  A 186 (1946), 453–461. DOI 10.1098/rspa.1946.0056 | MR 0017504 | Zbl 0063.03050
[27] A.  Bhattacharyya: On a measure of divergence between two statistical populations defined by their probability distributions. Bull. Calcutta Math. Soc. 35 (1943), 99–109. MR 0010358 | Zbl 0063.00364
[28] S. S.  Dragomir: Some inequalities for the Csiszár $\Phi $-divergence. Submitted.
[29] S. S.  Dragomir: A converse inequality for the Csiszár $\Phi $-divergence. Submitted. Zbl 1066.94007
[30] S. S. Dragomir: Some inequalities for $(m,M)$-convex mappings and applications for the Csiszár $\Phi $-divergence in information theory. Math. J. Ibaraki Univ. (Japan) 33 (2001), 35–50. MR 1883258 | Zbl 0996.26019
[31] F.  Liese, I.  Vajda: Convex Statistical Distances. Teubner Verlag, Leipzig, 1987. MR 0926905
[32] I.  Vajda: Theory of Statistical Inference and Information. Kluwer, Boston, 1989. Zbl 0711.62002