# Article

Keywords:
divergences; metric divergences; families of $f$-divergences
Summary:
We propose a simple method for constructing new families of $\phi$-divergences. This method, called convex standardization, is applicable to convex and concave functions $\psi(t)$ twice continuously differentiable in a neighborhood of $t=1$ with nonzero second derivative at the point $t=1$. Using this method we introduce several extensions of the LeCam, power, $\chi^a$ and Matusita divergences. The extended families are shown to connect these divergences smoothly with the Kullback divergence, or to connect various pairs of these particular divergences with each other. We also investigate the metric properties of divergences from these extended families.
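As an illustration of the general idea, one common way to turn a smooth convex (or concave) function $\psi$ into an $f$-divergence generator is to subtract the tangent line at $t=1$ and rescale by $\psi''(1)$, yielding $\phi(1)=0$, $\phi'(1)=0$ and $\phi''(1)=1$. The sketch below shows this affine normalization only; the paper's actual convex standardization may differ in detail, and the function names are illustrative, not the authors' notation.

```python
import math

def convex_standardize(psi, dpsi1, d2psi1):
    """Return phi(t) = (psi(t) - psi(1) - dpsi1*(t - 1)) / d2psi1.

    psi    : the original function psi(t)
    dpsi1  : psi'(1), the first derivative at t = 1
    d2psi1 : psi''(1), assumed nonzero (its sign also converts a
             concave psi into a convex generator)
    """
    psi1 = psi(1.0)
    return lambda t: (psi(t) - psi1 - dpsi1 * (t - 1.0)) / d2psi1

# Example: psi(t) = t*log(t) is convex with psi'(1) = 1, psi''(1) = 1;
# the standardized phi is the generator of the Kullback divergence
# shifted by its tangent at t = 1.
phi = convex_standardize(lambda t: t * math.log(t), 1.0, 1.0)

print(phi(1.0))        # 0 by construction
print(phi(2.0) > 0.0)  # convexity gives phi(t) >= 0
```

The normalization $\phi''(1)=1$ is what makes different families comparable near $t=1$, which is the regime governing the local (chi-square-like) behavior of the resulting divergences.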
References:
[1] Beirlant J., Devroye L., Győrfi L., Vajda I.: Large deviations of divergence measures of partitions. J. Statist. Plann. Inference 93 (2001), 1–16 MR 1822385
[2] Csiszár I., Fischer J.: Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen. Publ. Math. Inst. Hungar. Acad. Sci. 7 (1962), 159–180 MR 0191734
[3] Győrfi L., Vajda I.: Asymptotic distributions for goodness-of-fit statistics in a sequence of multinomial models. Statist. Probab. Lett. 56 (2002), 57–67 MR 1881531
[4] Hobza T., Molina I., Vajda I.: On convergence of Fisher’s information in continuous models with quantized observations. Test 4 (2005), 151–179
[5] Kafka P., Österreicher F., Vincze I.: On powers of Csiszár $f$-divergences defining a distance. Stud. Sci. Math. Hungar. 26 (1991), 415–422 MR 1197090
[6] Kullback S., Leibler R.: On information and sufficiency. Ann. Math. Statist. 22 (1951), 79–86 MR 0039968 | Zbl 0042.38403
[7] Kullback S.: Statistics and Information Theory. Wiley, New York 1957
[8] Kůs V.: Blended $\phi$-divergences with examples. Kybernetika 39 (2003), 43–54 MR 1980123
[9] Le Cam L.: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986 MR 0856411 | Zbl 0605.62002
[10] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 MR 0926905 | Zbl 0656.62004
[11] Liese F., Vajda I.: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394–4412 MR 2300826
[12] Lindsay B. G.: Efficiency versus robustness: The case of minimum Hellinger distance and other methods. Ann. Statist. 22 (1994), 1081–1114 MR 1292557
[13] Morales D., Pardo L., Vajda I.: Some new statistics for testing hypotheses in parametric models. J. Multivariate Anal. 62 (1997), 137–168 MR 1467878 | Zbl 0877.62020
[14] Morales D., Pardo L., Vajda I.: Limit laws for disparities of spacings. Nonparametric Statistics 15 (2003), 325–342 MR 1987078 | Zbl 1024.62020
[15] Morales D., Pardo L., Vajda I.: On the optimal number of classes in the Pearson goodness-of-fit tests. Kybernetika 41 (2005), 677–698 MR 2193859
[16] Österreicher F.: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 389–393 MR 1420130 | Zbl 0897.60015
[17] Österreicher F., Vajda I.: A new class of metric divergences on probability spaces and its applicability in statistics. Ann. Inst. Statist. Math. 55 (2003), 639–653 MR 2007803 | Zbl 1052.62002
[18] Pardo L.: Statistical Inference Based on Divergence Measures. Chapman & Hall, London 2006 MR 2183173 | Zbl 1118.62008
[19] Read T. C. R., Cressie N. A.: Goodness-of-fit Statistics for Discrete Multivariate Data. Springer, Berlin 1988 MR 0955054 | Zbl 0663.62065
[20] Vajda I.: $\chi ^{a}$-divergence and generalized Fisher information. In: Trans. 6th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague 1973, pp. 872–886 MR 0356302 | Zbl 0297.62003
[21] Vajda I.: Theory of Statistical Inference and Information. Kluwer, Boston 1989 Zbl 0711.62002
[22] Vajda I., Kůs V.: Relations Between Divergences, Total Variations and Euclidean Distances. Research Report No. 1853, Institute of Information Theory, Prague 1995
[23] Vajda I., van der Meulen E. C.: Optimization of Barron density estimates. IEEE Trans. Inform. Theory 47 (2001), 1867–1883 MR 1842524
