Keywords:
Kolmogorov distance; $\phi$-divergence; minimum distance estimator; consistency rate; computer simulation
Summary:
The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function $f_0$ on the real line. It shows that the AMDE always exists when the bounded $\phi$-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, the $n^{-1/2}$ consistency rate in any bounded $\phi$-divergence is established for the Kolmogorov, Lévy, and discrepancy estimators, provided that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and of several histogram-based variants of approximate minimum divergence estimators, such as the power-type and Le Cam estimators, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment, maximum likelihood, median) is provided for sample sizes $n=10,20,50,120,250$. The simulation analyzes the behaviour of the estimators across different families of distributions. It is shown that the performance of the AMKE differs from that of the other estimators depending on the family type, and that the AMKE copes with the Cauchy distribution more easily than the standard or divergence-based estimators do, especially for small sample sizes.
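The AMKE principle admits a compact numerical illustration. The following is a minimal sketch in Python, not the authors' implementation: it estimates the location and scale of a Cauchy model, the family singled out in the summary, by numerically minimizing the Kolmogorov distance between the empirical and the model distribution functions. The Nelder-Mead optimizer, the log-scale parametrization, and the median/IQR starting point are illustrative assumptions.

    # Minimal AMKE sketch (illustrative, not the paper's implementation):
    # minimize the sup-distance between the empirical CDF and the
    # Cauchy(loc, scale) CDF over the parameters.
    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    def kolmogorov_distance(theta, sample):
        # Kolmogorov distance evaluated at the order statistics:
        # D_n = max_i max( i/n - F(x_(i)), F(x_(i)) - (i-1)/n ).
        loc, log_scale = theta
        scale = np.exp(log_scale)              # keeps the scale positive
        x = np.sort(sample)
        n = x.size
        f = stats.cauchy.cdf(x, loc=loc, scale=scale)
        d_plus = np.max(np.arange(1, n + 1) / n - f)
        d_minus = np.max(f - np.arange(0, n) / n)
        return max(d_plus, d_minus)

    def amke_cauchy(sample):
        # Approximate minimizer; in the AMDE framework any parameter whose
        # distance comes close enough to the infimum qualifies, so the
        # numerical tolerance of the optimizer is harmless here.
        start = np.array([np.median(sample),
                          np.log(stats.iqr(sample) / 2)])  # assumed start
        res = minimize(kolmogorov_distance, start, args=(sample,),
                       method="Nelder-Mead")
        loc, log_scale = res.x
        return loc, np.exp(log_scale)

    rng = np.random.default_rng(0)
    data = stats.cauchy.rvs(loc=1.0, scale=2.0, size=50, random_state=rng)
    print(amke_cauchy(data))  # location near 1, scale near 2

A derivative-free method such as Nelder-Mead is used here because the objective is non-smooth: the Kolmogorov distance is piecewise smooth in the parameters, with kinks where the maximizing order statistic changes.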
References:
[1] Al Mohamad, D.: Towards a better understanding of the dual representation of phi divergences. Statistical Papers (published online 2016). DOI 10.1007/s00362-016-0812-5
[2] Barron, A. R.: The convergence in information of probability density estimators. In: IEEE Int. Symp. Information Theory, Kobe 1988.
[3] Beran, R.: Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5 (1977), 445-463. DOI 10.1214/aos/1176343842 | MR 0448700
[4] Berger, A.: Remark on separable spaces of probability measures. Ann. Math. Statist. 22 (1951), 119-120. DOI 10.1214/aoms/1177729701 | MR 0039789
[5] Broniatowski, M., Toma, A., Vajda, I.: Decomposable pseudodistances and applications in statistical estimation. J. Statist. Plann. Inference 142 (2012), 9, 2574-2585. DOI 10.1016/j.jspi.2012.03.019 | MR 2922007
[6] Csiszár, I.: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci., Ser. A 8 (1963), 84-108. MR 0164374
[7] Csiszár, I.: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318. MR 0219345
[8] Frýdlová, I., Vajda, I., Kůs, V.: Modified power divergence estimators in normal model - simulation and comparative study. Kybernetika 48 (2012), 4, 795-808. MR 3013399
[9] Gibbs, A. L., Su, F. E.: On choosing and bounding probability metrics. Int. Statist. Rev. 70 (2002), 419-435. DOI 10.1111/j.1751-5823.2002.tb00178.x
[10] Győrfi, L., Vajda, I., Meulen, E. C. van der: Family of point estimates yielded by $L_1$-consistent density estimate. In: $L_1$-Statistical Analysis and Related Methods (Y. Dodge, ed.), Elsevier, Amsterdam 1992, pp. 415-430. MR 1214843
[11] Győrfi, L., Vajda, I., Meulen, E. C. van der: Minimum Hellinger distance point estimates consistent under weak family regularity. Math. Methods Statist. 3 (1994), 25-45. MR 1272629
[12] Győrfi, L., Vajda, I., Meulen, E. C. van der: Minimum Kolmogorov distance estimates of parameters and parametrized distributions. Metrika 43 (1996), 237-255. DOI 10.1007/bf02613911 | MR 1394805
[13] Hrabáková, J., Kůs, V.: The consistency and robustness of modified Cramér-von Mises and Kolmogorov-Cramér estimators. Comm. Statist. - Theory and Methods 42 (2013), 20, 3665-3677. DOI 10.1080/03610926.2013.802806 | MR 3170957
[14] Hrabáková, J., Kůs, V.: Notes on consistency of some minimum distance estimators with simulation results. Metrika 80 (2017), 243-257. DOI 10.1007/s00184-016-0601-0 | MR 3597584
[15] Kafka, P., Österreicher, F., Vincze, I.: On powers of $f$-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 415-422. MR 1197090
[16] Kůs, V.: Blended $\phi$-divergences with examples. Kybernetika 39 (2003), 43-54. MR 1980123
[17] Kůs, V.: Nonparametric density estimates consistent of the order of $n^{-1/2}$ in the $L_1$-norm. Metrika 60 (2004), 1-14. DOI 10.1007/s001840300286 | MR 2100162
[18] Kůs, V., Morales, D., Vajda, I.: Extensions of the parametric families of divergences used in statistical inference. Kybernetika 44 (2008), 1, 95-112. MR 2405058
[19] Le Cam, L.: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986. DOI 10.1007/978-1-4612-4946-7 | MR 0856411
[20] Liese, F., Vajda, I.: Convex Statistical Distances. Teubner, Leipzig 1987. MR 0926905
[21] Liese, F., Vajda, I.: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412. DOI 10.1109/tit.2006.881731 | MR 2300826
[22] Matusita, K.: Distance and decision rules. Ann. Inst. Statist. Math. 16 (1964), 305-315. DOI 10.1007/bf02868578 | MR 0172419
[23] Österreicher, F.: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 4, 389-393. MR 1420130
[24] Pardo, L.: Statistical Inference Based on Divergence Measures. Chapman and Hall, Boston 2006. DOI 10.1201/9781420034813 | MR 2183173
[25] Pfanzagl, J.: Parametric Statistical Theory. W. de Gruyter, Berlin 1994. DOI 10.1515/9783110889765 | MR 1291393
[26] Vajda, I.: Theory of Statistical Inference and Information. Kluwer, Boston 1989.