Title:
|
Bounds for $f$-divergences under likelihood ratio constraints (English) |
Author:
|
Dragomir, S. S. |
Language:
|
English |
Journal:
|
Applications of Mathematics |
ISSN:
|
0862-7940 (print) |
ISSN:
|
1572-9109 (online) |
Volume:
|
48 |
Issue:
|
3 |
Year:
|
2003 |
Pages:
|
205-223 |
Summary lang:
|
English |
. |
Category:
|
math |
. |
Summary:
|
In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete random variables under likelihood ratio constraints in terms of the Kullback-Leibler distance. Some particular cases for the Hellinger and triangular discrimination, the $\chi ^2$-distance, Rényi’s divergences, etc. are also considered. (English) |
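The abstract's setting can be illustrated with a minimal Python sketch (the helper names are illustrative, not from the paper; the paper's actual bounds are not reproduced here): the Csiszár $f$-divergence $I_f(p,q)=\sum_i q_i\, f(p_i/q_i)$ of two discrete distributions, with the particular cases named above recovered by choosing the generating convex function $f$, and the likelihood ratio constraint $r \le p_i/q_i \le R$ computed for a concrete pair $(p, q)$.

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence I_f(p, q) = sum_i q_i * f(p_i / q_i)
    for discrete probability distributions p, q with q_i > 0.
    (Illustrative helper, not code from the paper.)"""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Particular cases via their generating convex functions f(t):
kl         = lambda p, q: f_divergence(p, q, lambda t: t * math.log(t))          # Kullback-Leibler distance
chi2       = lambda p, q: f_divergence(p, q, lambda t: (t - 1) ** 2)             # chi^2-distance
hellinger2 = lambda p, q: f_divergence(p, q, lambda t: (math.sqrt(t) - 1) ** 2)  # squared Hellinger distance
triangular = lambda p, q: f_divergence(p, q, lambda t: (t - 1) ** 2 / (t + 1))   # triangular discrimination

# Example distributions (hypothetical data, for illustration only)
p = [0.2, 0.5, 0.3]
q = [0.25, 0.4, 0.35]

# Likelihood ratio constraint of the paper: r <= p_i/q_i <= R for all i
ratios = [pi / qi for pi, qi in zip(p, q)]
r, R = min(ratios), max(ratios)
```

By Jensen's inequality each of these divergences is nonnegative and vanishes when $p = q$; the paper's contribution is to sandwich $I_f(p,q)$ between expressions in the Kullback-Leibler distance once $r$ and $R$ are known.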
Keyword:
|
$f$-divergence |
Keyword:
|
divergence measures in information theory |
Keyword:
|
Jensen’s inequality |
Keyword:
|
Hellinger and triangular discrimination |
MSC:
|
26D15 |
MSC:
|
94A17 |
idZBL:
|
Zbl 1099.94015 |
idMR:
|
MR1980368 |
DOI:
|
10.1023/A:1026054413327 |
. |
Date available:
|
2009-09-22T18:13:33Z |
Last updated:
|
2020-07-02 |
Stable URL:
|
http://hdl.handle.net/10338.dmlcz/134528 |
. |
Reference:
|
[1] I. Csiszár: Information measures: A critical survey. Trans. 7th Prague Conf. on Info. Th., Statist. Decis. Funct., Random Processes and 8th European Meeting of Statist., Volume B, Academia Prague, 1978, pp. 73–86. MR 0519465 |
Reference:
|
[2] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345 |
Reference:
|
[3] I. Csiszár, J. Körner: Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, New York, 1981. MR 0666545 |
Reference:
|
[4] J. H. Justice (ed.): Maximum Entropy and Bayesian Methods in Applied Statistics. Cambridge University Press, Cambridge, 1986. Zbl 0597.00025, MR 0892126 |
Reference:
|
[5] J. N. Kapur: On the roles of maximum entropy and minimum discrimination information principles in statistics. Technical Address of the 38th Annual Conference of the Indian Society of Agricultural Statistics, 1984, pp. 1–44. MR 0786009 |
Reference:
|
[6] I. Burbea, C. R. Rao: On the convexity of some divergence measures based on entropy functions. IEEE Transactions on Information Theory 28 (1982), 489–495. MR 0672884, 10.1109/TIT.1982.1056497 |
Reference:
|
[7] R. G. Gallager: Information Theory and Reliable Communication. J. Wiley, New York, 1968. |
Reference:
|
[8] C. E. Shannon: A mathematical theory of communication. Bell Syst. Tech. J. 27 (1948), 379–423 and 623–656. Zbl 1154.94303, MR 0026286 |
Reference:
|
[9] B. R. Frieden: Image enhancement and restoration. Picture Processing and Digital Filtering, T. S. Huang (ed.), Springer-Verlag, Berlin, 1975. |
Reference:
|
[10] R. M. Leahy, C. E. Goutis: An optimal technique for constraint-based image restoration and mensuration. IEEE Trans. on Acoustics, Speech and Signal Processing 34 (1986), 1629–1642. 10.1109/TASSP.1986.1165001 |
Reference:
|
[11] S. Kullback: Information Theory and Statistics. J. Wiley, New York, 1959. Zbl 0088.10406, MR 0103557 |
Reference:
|
[12] S. Kullback, R. A. Leibler: On information and sufficiency. Ann. Math. Statistics 22 (1951), 79–86. MR 0039968, 10.1214/aoms/1177729694 |
Reference:
|
[13] R. Beran: Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5 (1977), 445–463. Zbl 0381.62028, MR 0448700, 10.1214/aos/1176343842 |
Reference:
|
[14] A. Rényi: On measures of entropy and information. Proc. Fourth Berkeley Symp. Math. Statist. Prob., Vol. 1, University of California Press, Berkeley, 1961. Zbl 0106.33001, MR 0132570 |
Reference:
|
[15] S. S. Dragomir, N. M. Ionescu: Some converses of Jensen’s inequality and applications. Anal. Numer. Theor. Approx. 23 (1994), 71–78. MR 1325895 |
Reference:
|
[16] S. S. Dragomir, C. J. Goh: A counterpart of Jensen’s discrete inequality for differentiable convex mappings and applications in information theory. Math. Comput. Modelling 24 (1996), 1–11. MR 1403525, 10.1016/0895-7177(96)00085-4 |
Reference:
|
[17] S. S. Dragomir, C. J. Goh: Some counterpart inequalities for a functional associated with Jensen’s inequality. J. Inequal. Appl. 1 (1997), 311–325. MR 1732628 |
Reference:
|
[18] S. S. Dragomir, C. J. Goh: Some bounds on entropy measures in information theory. Appl. Math. Lett. 10 (1997), 23–28. MR 1457634, 10.1016/S0893-9659(97)00028-1 |
Reference:
|
[19] S. S. Dragomir, C. J. Goh: A counterpart of Jensen’s continuous inequality and applications in information theory. RGMIA Preprint, http://matilda.vu.edu.au/~rgmia/InfTheory/Continuse.dvi. MR 1977385 |
Reference:
|
[20] M. Matić: Jensen’s inequality and applications in information theory. Ph.D. Thesis, Univ. of Zagreb, 1999. (Croatian) |
Reference:
|
[21] S. S. Dragomir, J. Šunde and M. Scholz: Some upper bounds for relative entropy and applications. Comput. Math. Appl. 39 (2000), 257–266. MR 1753564, 10.1016/S0898-1221(00)00089-4 |
Reference:
|
[22] J. N. Kapur: A comparative assessment of various measures of directed divergence. Advances Manag. Stud. 3 (1984), 1–16. |
Reference:
|
[23] D. S. Mitrinović, J. E. Pečarić and A. M. Fink: Classical and New Inequalities in Analysis. Kluwer Academic Publishers, 1993. MR 1220224 |
Reference:
|
[24] F. Topsøe: Some inequalities for information divergence and related measures of discrimination. Res. Rep. Coll. RGMIA 2 (1999), 85–98. MR 1768575 |
Reference:
|
[25] D. Dacunha-Castelle: Vitesse de convergence pour certains problèmes statistiques. In: École d’été de Probabilités de Saint-Flour, VII (Saint-Flour, 1977). Lecture Notes in Math. Vol. 678, Springer, Berlin, 1978, pp. 1–172. Zbl 0387.62015, MR 0518733 |
Reference:
|
[26] H. Jeffreys: An invariant form for the prior probability in estimation problems. Proc. Roy. Soc. London Ser. A 186 (1946), 453–461. Zbl 0063.03050, MR 0017504, 10.1098/rspa.1946.0056 |
Reference:
|
[27] A. Bhattacharyya: On a measure of divergence between two statistical populations defined by their probability distributions. Bull. Calcutta Math. Soc. 35 (1943), 99–109. Zbl 0063.00364, MR 0010358 |
Reference:
|
[28] S. S. Dragomir: Some inequalities for the Csiszár $\Phi $-divergence. Submitted. |
Reference:
|
[29] S. S. Dragomir: A converse inequality for the Csiszár $\Phi $-divergence. Submitted. Zbl 1066.94007 |
Reference:
|
[30] S. S. Dragomir: Some inequalities for $(m,M)$-convex mappings and applications for the Csiszár $\Phi $-divergence in information theory. Math. J. Ibaraki Univ. (Japan) 33 (2001), 35–50. Zbl 0996.26019, MR 1883258 |
Reference:
|
[31] F. Liese, I. Vajda: Convex Statistical Distances. Teubner Verlag, Leipzig, 1987. MR 0926905 |
Reference:
|
[32] I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston, 1989. Zbl 0711.62002 |
. |