Keywords:
minimum $\phi$-divergence estimation; subdivergence; superdivergence; PC simulation; relative efficiency; robustness
Summary:
Point estimators based on the minimization of information-theoretic divergences between the empirical and the hypothetical distribution run into a problem when working with continuous families, which are measure-theoretically orthogonal to the family of empirical distributions. In this case the $\phi$-divergence is always equal to its upper bound, and the minimum $\phi$-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to overcome this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by $\alpha \in \mathbb{R}$ in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different $\phi$-divergence parameters.
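To fix notation, recall the power-divergence family in a standard parametrization (cf. Liese and Vajda [6]); the display below is a sketch of the definitions behind the summary, not a verbatim reproduction of the paper's notation:
$$\phi_\alpha(t) = \frac{t^\alpha - \alpha(t-1) - 1}{\alpha(\alpha-1)}, \quad \alpha \in \mathbb{R} \setminus \{0,1\}, \qquad D_{\phi_\alpha}(P,Q) = \int \phi_\alpha\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q,$$
with the limiting cases $\phi_1(t) = t\ln t - t + 1$ (Kullback-Leibler divergence) and $\phi_0(t) = -\ln t + t - 1$ (reversed Kullback-Leibler divergence). Every $\phi$-divergence satisfies $0 \le D_\phi(P,Q) \le \phi(0) + \phi^*(0)$, where $\phi^*(t) = t\,\phi(1/t)$, with equality whenever $P$ and $Q$ are mutually singular. An empirical distribution is discrete while the hypothetical Gaussian family is continuous, so the two are always mutually singular; the divergence then sits at its upper bound for every parameter value, which is why the unmodified minimum $\phi$-divergence rule degenerates.

The simulation design can be illustrated by a minimal Monte Carlo sketch in Python. It reproduces only the contaminated-Gaussian sampling scheme and the relative-efficiency comparison; the sample mean and median act as hypothetical stand-ins for the modified minimum power-divergence estimators of [3], and the contamination parameters eps, mu_c and sigma_c are illustrative assumptions rather than values from the paper:

    import numpy as np

    rng = np.random.default_rng(0)

    def contaminated_sample(n, mu=0.0, sigma=1.0, eps=0.1, mu_c=5.0, sigma_c=3.0):
        # Draw n points from the mixture (1-eps)*N(mu, sigma^2) + eps*N(mu_c, sigma_c^2);
        # eps = 0 gives an uncontaminated Gaussian sample.
        mask = rng.random(n) < eps
        x = rng.normal(mu, sigma, n)
        x[mask] = rng.normal(mu_c, sigma_c, mask.sum())
        return x

    def mc_mse(estimator, n, eps, reps=2000):
        # Monte Carlo mean squared error of a location estimator at the true mu = 0.
        est = np.array([estimator(contaminated_sample(n, eps=eps)) for _ in range(reps)])
        return np.mean(est ** 2)

    for n in (20, 100, 500):
        for eps in (0.0, 0.1):
            # Relative efficiency of the median with respect to the sample mean.
            re = mc_mse(np.mean, n, eps) / mc_mse(np.median, n, eps)
            print(f"n={n:4d}  eps={eps:.1f}  RE(median/mean) = {re:.3f}")

Substituting implementations of the sub- and superdivergence rules for the two stand-in estimators would yield relative-efficiency comparisons of the kind reported in the study.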
References:
[1] M. Broniatowski, A. Keziou: Minimization of $\phi$-divergences on sets of signed measures. Studia Sci. Math. Hungar. 43 (2006), 403-442. MR 2273419 | Zbl 1121.28004
[2] M. Broniatowski, A. Keziou: Parametric estimation and tests through divergences and the duality technique. J. Multivariate Anal. 100 (2009), 16-36. DOI 10.1016/j.jmva.2008.03.011 | MR 2460474 | Zbl 1151.62023
[3] M. Broniatowski, I. Vajda: Several Applications of Divergence Criteria in Continuous Families. Research Report No. 2257. Institute of Information Theory and Automation, Prague 2009.
[4] I. Frýdlová: Minimum Kolmogorov Distance Estimators. Diploma Thesis. Czech Technical University, Prague 2004.
[5] I. Frýdlová: Modified Power Divergence Estimators and Their Performances in Normal Models. In: Proc. FernStat2010, Faculty of Social and Economic Studies UJEP, Ústí n. L. 2010, 28-33.
[6] F. Liese, I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412. DOI 10.1109/TIT.2006.881731 | MR 2300826
[7] A. Toma, S. Leoni-Aubin: Robust tests based on dual divergence estimators and saddlepoint approximations. J. Multivariate Anal. 101 (2010), 1143-1155. DOI 10.1016/j.jmva.2009.11.001 | MR 2595297 | Zbl 1185.62042
[8] A. Toma, M. Broniatowski: Dual divergence estimators and tests: Robustness results. J. Multivariate Anal. 102 (2011), 20-36. DOI 10.1016/j.jmva.2010.07.010 | MR 2729417 | Zbl 1206.62034
[9] I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston 1989. Zbl 0711.62002