
# Article

Keywords:
entropy; asymptotic distribution; maximum likelihood estimators; testing statistical hypotheses
Summary:
To study the asymptotic properties of entropy estimates, we use a unified expression, called the $H^{\varphi _{1},\varphi _{2}}_{h,v}$-entropy. Asymptotic distributions for these statistics are given in several cases when maximum likelihood estimators are considered, so they can be used to construct confidence intervals and to test statistical hypotheses based on one or more samples. These results can also be applied to multinomial populations.
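The asymptotic normality described in the summary can be illustrated in its simplest special case: the plug-in (maximum likelihood) Shannon entropy of a multinomial population, whose limiting variance follows from the delta method. This is a minimal sketch of that special case only, not the paper's general $H^{\varphi _{1},\varphi _{2}}_{h,v}$ framework; the function names and the normal-approximation confidence interval below are illustrative assumptions.

```python
import math

def shannon_entropy(probs):
    """Plug-in (maximum likelihood) estimate of Shannon entropy
    H = -sum_i p_i log p_i, skipping zero cells."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropy_ci(counts, z=1.96):
    """Asymptotic confidence interval for the Shannon entropy of a
    multinomial population (illustrative sketch, not the paper's method).

    By the delta method, sqrt(n) * (H_hat - H) -> N(0, sigma^2) with
        sigma^2 = sum_i p_i (log p_i)^2 - (sum_i p_i log p_i)^2,
    estimated here by substituting the MLE p_hat_i = counts_i / n.
    """
    n = sum(counts)
    p_hat = [c / n for c in counts]
    h_hat = shannon_entropy(p_hat)
    m1 = sum(p * math.log(p) for p in p_hat if p > 0)       # E[log p]
    m2 = sum(p * math.log(p) ** 2 for p in p_hat if p > 0)  # E[(log p)^2]
    var = m2 - m1 ** 2
    half = z * math.sqrt(var / n)
    return h_hat - half, h_hat + half
```

Note that for the uniform distribution the estimated asymptotic variance degenerates to zero, so a first-order interval collapses; this is one reason the asymptotic results for several sampling situations are treated separately.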
References:
[1] J. Aczél, Z. Daróczy: Charakterisierung der Entropien positiver Ordnung und der Shannonschen Entropie. Acta Math. Acad. Sci. Hungar. 14 (1963), 95–121. DOI 10.1007/BF01901932 | MR 0191738
[2] S. Arimoto: Information-theoretical considerations on estimation problems. Information and Control 19 (1971), 181–194. DOI 10.1016/S0019-9958(71)90065-9 | MR 0309224 | Zbl 0222.94022
[3] M. Belis, S. Guiasu: A quantitative-qualitative measure of information in cybernetic systems. IEEE Trans. Inf. Th. IT-14 (1968), 593–594. DOI 10.1109/TIT.1968.1054185
[5] C. Ferreri: Hypoentropy and related heterogeneity divergence measures. Statistica 40 (1980), 55–118. MR 0586545
[6] P. Gil: Medidas de incertidumbre e información en problemas de decisión estadística [Measures of uncertainty and information in statistical decision problems]. Rev. de la R. Ac. de CC. Exactas, Físicas y Naturales de Madrid LXIX (1975), 549–610. MR 0394956
[7] J. Havrda, F. Charvát: Concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30–35. MR 0209067
[8] J.N. Kapur: Generalized entropy of order $\alpha$ and type $\beta$. The Math. Seminar 4 (1967), 78–82. MR 0269428
[9] C.F. Picard: The use of information theory in the study of the diversity of biological populations. Proc. Fifth Berkeley Symp. IV, 1967, pp. 163–177.
[10] C.R. Rao: Linear statistical inference and its applications. 2nd ed. John Wiley, New York, 1973. MR 0346957 | Zbl 0256.62002
[11] A. Rényi: On the measures of entropy and information. Proc. 4th Berkeley Symp. Math. Statist. and Prob. 1, 1961, pp. 547–561. MR 0132570
[12] A.P. Sant’anna, I.J. Taneja: Trigonometric entropies, Jensen difference divergences and error bounds. Inform. Sci. 35 (1985), 145–156. MR 0794765 | Zbl 0582.94009
[13] C.E. Shannon: A mathematical theory of communication. Bell System Tech. J. 27 (1948), 379–423. DOI 10.1002/j.1538-7305.1948.tb01338.x | MR 0026286 | Zbl 1154.94303
[14] B.D. Sharma, D.P. Mittal: New non-additive measures of relative information. J. Comb. Inform. & Syst. Sci. 2 (1975), 122–133. MR 0476167
[15] B.D. Sharma, I.J. Taneja: Entropy of type $(\alpha ,\beta)$ and other generalized measures in information theory. Metrika 22 (1975), 205–215. DOI 10.1007/BF01899728 | MR 0398670
[16] B.D. Sharma, I.J. Taneja: Three generalized additive measures of entropy. Elect. Infor. Kybern. 13 (1977), 419–433. MR 0530208
[17] I.J. Taneja: A study of generalized measures in information theory. Ph.D. Thesis. University of Delhi, 1975.
[18] R.S. Varma: Generalizations of Rényi’s entropy of order $\alpha$. J. Math. Sci. 1 (1966), 34–48. MR 0210515 | Zbl 0166.15401
