Article

Title: $\phi$-divergences, sufficiency, Bayes sufficiency, and deficiency (English)
Author: Liese, Friedrich
Language: English
Journal: Kybernetika
ISSN: 0023-5954
Volume: 48
Issue: 4
Year: 2012
Pages: 690-713
Summary lang: English
Category: math
Summary: The paper studies the relations between $\phi$-divergences and fundamental concepts of decision theory such as sufficiency, Bayes sufficiency, and LeCam's deficiency. A new and considerably simplified approach is given to the spectral representation of $\phi$-divergences already established in Österreicher and Feldman [28] under restrictive conditions and in Liese and Vajda [22], [23] in the general form. The simplification is achieved by a new integral representation of convex functions in terms of elementary convex functions which are strictly convex at one point only. Bayes sufficiency is characterized with the help of a binary model that consists of the joint distribution and the product of the marginal distributions of the observation and the parameter, respectively. LeCam's deficiency is expressed in terms of $\phi$-divergences where $\phi$ belongs to a class of convex functions whose curvature measures are finite and satisfy a normalization condition. (English)
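Note: for orientation, the following recalls the standard form of the $\phi$-divergence the summary refers to; it is a minimal sketch in common notation and does not reproduce the paper's own conventions or its representation via elementary convex functions. For probability measures $P$, $Q$ and a convex function $\phi:(0,\infty)\to\mathbb{R}$ with $\phi(1)=0$,
$$ D_\phi(P,Q) \;=\; \int \phi\!\left(\frac{dP}{dQ}\right) dQ , $$
with the usual conventions when $P$ is not absolutely continuous with respect to $Q$. Familiar special cases (up to the usual normalization constants) are $\phi(t)=t\ln t$ (Kullback-Leibler divergence), $\phi(t)=|t-1|$ (total variation), and $\phi(t)=(\sqrt{t}-1)^2$ (squared Hellinger distance).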
Keyword: divergences
Keyword: sufficiency
Keyword: Bayes sufficiency
Keyword: deficiency
MSC: 62B05
MSC: 62B10
MSC: 62B15
MSC: 62G10
idMR: MR3013395
Date available: 2012-11-10T22:03:23Z
Last updated: 2013-09-24
Stable URL: http://hdl.handle.net/10338.dmlcz/143056
Reference: [1] S. M. Ali, S. D. Silvey: A general class of coefficients of divergence of one distribution from another..J. Roy. Statist. Soc. Ser. B 28 (1966), 131-140. Zbl 0203.19902, MR 0196777
Reference: [2] S. Arimoto: Information-theoretical considerations on estimation problems..Inform. Control. 19 (1971), 181-194. Zbl 0222.94022, MR 0309224, 10.1016/S0019-9958(71)90065-9
Reference: [3] A. R. Barron, L. Györfi, E. C. van der Meulen: Distribution estimates consistent in total variation and two types of information divergence..IEEE Trans. Inform. Theory 38 (1992), 1437-1454. 10.1109/18.149496
Reference: [4] A. Berlinet, I. Vajda, E. C. van der Meulen: About the asymptotic accuracy of Barron density estimates..IEEE Trans. Inform. Theory 44 (1998), 999-1009. Zbl 0952.62029, MR 1616679, 10.1109/18.669143
Reference: [5] A. Bhattacharyya: On some analogues of the amount of information and their use in statistical estimation..Sankhya 8 (1946), 1-14. MR 0020242
Reference: [6] H. Chernoff: A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations..Ann. Math. Statist. 23 (1952), 493-507. MR 0057518, 10.1214/aoms/1177729330
Reference: [7] B. S. Clarke, A. R. Barron: Information-theoretic asymptotics of Bayes methods..IEEE Trans. Inform. Theory 36 (1990), 453-471. Zbl 0709.62008, MR 1053841, 10.1109/18.54897
Reference: [8] I. Csiszár: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten..Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), 84-108. MR 0164374
Reference: [9] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations..Studia Sci. Math. Hungar. 2 (1967), 299-318. Zbl 0157.25802, MR 0219345
Reference: [10] T. Cover, J. Thomas: Elements of Information Theory..Wiley, New York 1991. Zbl 1140.94001, MR 1122806
Reference: [11] M. H. DeGroot: Optimal Statistical Decisions..McGraw-Hill, New York 1970. MR 0356303
Reference: [12] D. Feldman, F. Österreicher: A note on $f$-divergences..Studia Sci. Math. Hungar. 24 (1989), 191-200. Zbl 0725.62005, MR 1051149
Reference: [13] A. Guntuboyina: Lower bounds for the minimax risk using $f$-divergences, and applications..IEEE Trans. Inform. Theory 57 (2011), 2386-2399. MR 2809097, 10.1109/TIT.2011.2110791
Reference: [14] C. Guttenbrunner: On applications of the representation of $f$-divergences as averaged minimal Bayesian risk..In: Trans. 11th Prague Conf. Inform. Theory, Statist. Dec. Funct., Random Processes A, 1992, pp. 449-456.
Reference: [15] L. Jager, J. A. Wellner: Goodness-of-fit tests via phi-divergences..Ann. Statist. 35 (2007), 2018-2053. Zbl 1126.62030, MR 2363962, 10.1214/0009053607000000244
Reference: [16] T. Kailath: The divergence and Bhattacharyya distance measures in signal selection..IEEE Trans. Commun. Technol. 15 (1967), 52-60. 10.1109/TCOM.1967.1089532
Reference: [17] S. Kakutani: On equivalence of infinite product measures..Ann. Math. 49 (1948), 214-224. Zbl 0030.02303, MR 0023331, 10.2307/1969123
Reference: [18] S. Kullback, R. Leibler: On information and sufficiency..Ann. Math. Statist. 22 (1951), 79-86. Zbl 0042.38403, MR 0039968, 10.1214/aoms/1177729694
Reference: [19] L. LeCam: Locally asymptotically normal families of distributions..Univ. Calif. Publ. 3 (1960), 37-98. MR 0126903
Reference: [20] L. LeCam: Asymptotic Methods in Statistical Decision Theory..Springer, Berlin 1986.
Reference: [21] F. Liese, I. Vajda: Convex Statistical Distances..Teubner, Leipzig 1987. Zbl 0656.62004, MR 0926905
Reference: [22] F. Liese, I. Vajda: On divergences and informations in statistics and information theory..IEEE Trans. Inform. Theory 52 (2006), 4394-4412. MR 2300826, 10.1109/TIT.2006.881731
Reference: [23] F. Liese, I. Vajda: $f$-divergences: Sufficiency, deficiency and testing of hypotheses..In: Advances in Inequalities from Probability Theory and Statistics. (N. S. Barnett and S. S. Dragomir, eds.), Nova Science Publishers, Inc., New York 2008, pp. 113-149. MR 2459971
Reference: [24] F. Liese, K. J. Miescke: Statistical Decision Theory, Estimation, Testing and Selection..Springer, New York 2008. Zbl 1154.62008, MR 2421720
Reference: [25] K. Matusita: Decision rules based on the distance, for problems of fit, two samples and estimation..Ann. Math. Statist. 26 (1955), 613-640. Zbl 0065.12101, MR 0073899, 10.1214/aoms/1177728422
Reference: [26] D. Mussmann: Sufficiency and $f$-divergences..Studia Sci. Math. Hungar. 14 (1979), 37-41.
Reference: [27] X. Nguyen, M. J. Wainwright, M. I. Jordan: On surrogate loss functions and $f$-divergences..Ann. Statist. 37 (2009), 876-904. Zbl 1162.62060, MR 2502654, 10.1214/08-AOS595
Reference: [28] F. Österreicher, D. Feldman: Divergenzen von Wahrscheinlichkeitsverteilungen - integralgeometrisch betrachtet..Acta Math. Acad. Sci. Hungar. 37 (1981), 329-337. Zbl 0477.60013, MR 0619882, 10.1007/BF01895132
Reference: [29] F. Österreicher, I. Vajda: Statistical information and discrimination..IEEE Trans. Inform. Theory 39 (1993), 1036-1039. Zbl 0792.62005, MR 1237725, 10.1109/18.256536
Reference: [30] J. Pfanzagl: A characterization of sufficiency by power functions..Metrika 21 (1974), 197-199. Zbl 0289.62009, MR 0365797, 10.1007/BF01893900
Reference: [31] H. V. Poor: Robust decision design using a distance criterion..IEEE Trans. Inform. Theory 26 (1980), 578-587. Zbl 0445.62017, MR 0583942
Reference: [32] T. R. C. Read, N. A. C. Cressie: Goodness-of-Fit Statistics for Discrete Multivariate Data..Springer, Berlin 1988. Zbl 0663.62065, MR 0955054
Reference: [33] A. Rényi: On measures of entropy and information..In: Proc. 4th Berkeley Symp. on Probab. Theory and Math. Statist. Berkeley Univ. Press, Berkeley 1961, pp. 547-561. Zbl 0106.33001, MR 0132570
Reference: [34] A. W. Roberts, D. E. Varberg: Convex Functions..Academic Press, New York 1973. Zbl 0289.26012, MR 0442824
Reference: [35] M. J. Schervish: Theory of Statistics..Springer, New York 1995. Zbl 0834.62002, MR 1354146
Reference: [36] C. E. Shannon: A mathematical theory of communication..Bell. Syst. Tech. J. 27 (1948), 379-423, 623-656. Zbl 1154.94303, MR 0026286, 10.1002/j.1538-7305.1948.tb01338.x
Reference: [37] H. Strasser: Mathematical Theory of Statistics..De Gruyter, Berlin 1985. Zbl 0594.62017, MR 0812467
Reference: [38] F. Topsøe: Information-theoretical optimization techniques..Kybernetika 15 (1979), 7-17. MR 0529888
Reference: [39] F. Topsøe: Some inequalities for information divergence and related measures of discrimination..IEEE Trans. Inform. Theory 46 (2000), 1602-1609. MR 1768575, 10.1109/18.850703
Reference: [40] E. Torgersen: Comparison of Statistical Experiments..Cambridge Univ. Press, Cambridge 1991. Zbl 1158.62006, MR 1104437
Reference: [41] I. Vajda: On the $f$-divergence and singularity of probability measures..Periodica Math. Hungar. 2 (1972), 223-234. Zbl 0248.62001, MR 0335163, 10.1007/BF02018663
Reference: [42] I. Vajda: Theory of Statistical Inference and Information..Kluwer Academic Publishers, Dordrecht - Boston - London 1989. Zbl 0711.62002
Reference: [43] I. Vajda: On metric divergences of probability measures..Kybernetika 45 (2009), 885-900. MR 2650071
Reference: [44] I. Vajda: On convergence of information contained in quantized observations..IEEE Trans. Inform. Theory 48 (2002), 2163-2172. Zbl 1062.94533, MR 1930280, 10.1109/TIT.2002.800497
Reference: [45] I. Vincze: On the concept and measure of information contained in an observation..In: Contribution to Probability. (J. Gani and V. F. Rohatgi, eds.) Academic Press, New York 1981, pp. 207-214. Zbl 0531.62002, MR 0618690