Article

Title: $(h,\Phi)$-entropy differential metric (English)
Author: Menéndez, M. L.
Author: Morales, D.
Author: Pardo, L.
Author: Salicrú, M.
Language: English
Journal: Applications of Mathematics
ISSN: 0862-7940 (print)
ISSN: 1572-9109 (online)
Volume: 42
Issue: 2
Year: 1997
Pages: 81-98
Summary lang: English
Category: math
Summary: Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on $(h,\Phi )$-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic distances in testing statistical hypotheses is illustrated by an example within the Pareto family. We obtain the asymptotic distribution of the information matrices associated with the metric when the parameter is replaced by its maximum likelihood estimator. The relation between the information matrices and the Cramér-Rao inequality is also obtained. (English)
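Note: For orientation, the displays below sketch the construction the summary refers to; they are a minimal sketch modeled on the cited works, and the sign and normalization conventions (in particular the factor $h'$ in the metric) are assumptions, not quoted from the paper. The $(h,\Phi)$-entropy of a density $f(x,\theta)$ (Salicrú et al., 1993) is
$$H_{(h,\Phi)}(\theta)=h\Bigl(\int_{\mathcal X}\Phi\bigl(f(x,\theta)\bigr)\,d\mu(x)\Bigr),$$
where $h$ and $\Phi$ are suitably smooth real functions. Following the Burbea and Rao (1982a) entropy-metric construction, the associated quadratic differential metric takes the form
$$ds^2_{(h,\Phi)}(\theta)=\sum_{i,j}g_{ij}(\theta)\,d\theta_i\,d\theta_j,\qquad g_{ij}(\theta)=h'\Bigl(\int_{\mathcal X}\Phi(f)\,d\mu\Bigr)\int_{\mathcal X}\Phi''\bigl(f(x,\theta)\bigr)\,\frac{\partial f}{\partial\theta_i}\,\frac{\partial f}{\partial\theta_j}\,d\mu(x).$$
Taking $h(x)=x$ and $\Phi(x)=x\log x$, so that $\Phi''(x)=1/x$, gives $g_{ij}(\theta)=\int_{\mathcal X}f\,\partial_i\log f\,\partial_j\log f\,d\mu=E_\theta[\partial_i\log f\,\partial_j\log f]$, i.e. the Fisher information metric, which is the Burbea-Rao special case mentioned in the summary.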
Keyword: $(h,\Phi )$-entropy measures
Keyword: information metric
Keyword: geodesic distance between probability distributions
Keyword: maximum likelihood estimators
Keyword: asymptotic distributions
Keyword: Cramér-Rao inequality
Keyword: generalized entropies
MSC: 53B20
MSC: 62B10
MSC: 62E20
MSC: 62H12
MSC: 94A17
idZBL: Zbl 0898.62005
idMR: MR1430403
DOI: 10.1023/A:1022214326758
Date available: 2009-09-22T17:53:50Z
Last updated: 2020-07-02
Stable URL: http://hdl.handle.net/10338.dmlcz/134347
Reference: [1] S. I. Amari: A foundation of information geometry. Electronics and Communications in Japan 66-A (1983), 1–10. MR 0747878
Reference: [2] C. Atkinson, A. F. S. Mitchell: Rao's distance measure. Sankhyā Ser. A 43 (1981), 345–365. MR 0665876
Reference: [3] S. Arimoto: Information-theoretical considerations on estimation problems. Information and Control 19 (1971), 181–194. Zbl 0222.94022, MR 0309224, DOI 10.1016/S0019-9958(71)90065-9
Reference: [4] J. Burbea: Informative geometry of probability spaces. Expositiones Mathematicae 4 (1986), 347–378. MR 0867963
Reference: [5] J. Burbea, C. R. Rao: Entropy differential metric, distance and divergence measures in probability spaces: A unified approach. J. Multivariate Anal. 12 (1982a), 575–596. MR 0680530, DOI 10.1016/0047-259X(82)90065-3
Reference: [6] J. Burbea, C. R. Rao: On the convexity of some divergence measures based on entropy functions. IEEE Trans. Inform. Theory IT-28 (1982b), 489–495. MR 0672884
Reference: [7] J. Burbea, C. R. Rao: On the convexity of higher order Jensen differences based on entropy functions. IEEE Trans. Inform. Theory IT-28 (1982c), 961–963. MR 0687297
Reference: [8] N. N. Cencov: Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs 53, American Mathematical Society, Providence, 1982. MR 0645898
Reference: [9] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345
Reference: [10] K. Ferentinos, T. Papaioannou: New parametric measures of information. Information and Control 51 (1981), 193–208. MR 0686839
Reference: [11] C. Ferreri: Hypoentropy and related heterogeneity divergence measures. Statistica 40 (1980), 55–118. MR 0586545
Reference: [12] J. Havrda, F. Charvát: Concept of structural $\alpha $-entropy. Kybernetika 3 (1967), 30–35.
Reference: [13] D. Morales, L. Pardo, M. Salicrú, M. L. Menéndez: New parametric measures of information based on generalized $R$-divergences. In: Statistical Sciences and Data Analysis, VSP, Utrecht, 1993, pp. 473–488. MR 1268437
Reference: [14] R. J. Muirhead: Aspects of Multivariate Statistical Theory. Wiley, New York, 1982. MR 0652932
Reference: [15] O. Onicescu: Énergie informationnelle. C. R. Acad. Sci. Paris Sér. A 263 (1966), 841–842. MR 0229478
Reference: [16] C. R. Rao: Information and accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 37 (1945), 81–91. MR 0015748
Reference: [17] C. R. Rao: Differential metrics in probability spaces. In: Differential Geometry in Statistical Inference, IMS Lecture Notes–Monograph Series 10, Institute of Mathematical Statistics, Hayward, 1987.
Reference: [18] A. Rényi: On measures of entropy and information. Proc. Fourth Berkeley Symp. Math. Statist. Prob., Vol. 1, Univ. of California Press, Berkeley, 1961, pp. 547–561. MR 0132570
Reference: [19] M. Salicrú, M. L. Menéndez, D. Morales, L. Pardo: Asymptotic distribution of $(h,\Phi )$-entropies. Comm. Statist. Theory Methods 22 (1993), no. 7, 2015–2031. MR 1238377
Reference: [20] C. E. Shannon: A mathematical theory of communication. Bell System Tech. J. 27 (1948), 379–423. Zbl 1154.94303, MR 0026286
Reference: [21] B. D. Sharma, I. J. Taneja: Entropy of type $(\alpha , \beta )$ and other generalized measures in information theory. Metrika 22 (1975), 205–215. MR 0398670
Reference: [22] B. D. Sharma, P. Mittal: New non-additive measures of relative information. J. Comb. Inform. System Sci. 2 (1975), 122–133. MR 0476167
Reference: [23] I. J. Taneja: A study of generalized measures in information theory. Ph.D. Thesis, University of Delhi, 1975.
Reference: [24] I. J. Taneja: On generalized information measures and their applications. Advances in Electronics and Electron Physics 76 (1989), 327–413.
Reference: [25] I. Vajda, K. Vašek: Majorization, concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105–115. MR 0806056
Reference: [26] J. C. A. Van der Lubbe: $R$-norm information and a general class of measures for certainty and information. M.Sc. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1977. (In Dutch.)
Reference: [27] J. C. A. Van der Lubbe: A generalized probabilistic theory of the measurement of certainty and information. Ph.D. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1981.
Reference: [28] R. S. Varma: Generalizations of Rényi's entropy of order $\alpha $. J. Math. Sci. 1 (1966), 34–48. Zbl 0166.15401, MR 0210515

Files

AplMat_42-1997-2_1.pdf (399.8 KB, application/pdf)