
Article

Keywords:
information divergence; point estimation; testing statistical hypotheses
Summary:
Within the framework of the standard model of asymptotic statistics, we introduce a global information contained in the statistical experiment about the occurrence of the true parameter in a given set. Basic properties of this information are established, including its relations to the Kullback and Fisher information. Its applicability to point estimation and to testing statistical hypotheses is demonstrated.
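For orientation only, the two classical quantities mentioned in the summary may be recalled in their standard textbook form; the notation (densities $p_\theta$ with respect to a dominating measure $\mu$, scalar parameter $\theta$) is an assumption made here, and the paper's global information measure itself is not reproduced in this summary. The Kullback divergence between $p_{\theta_0}$ and $p_\theta$ and the Fisher information at $\theta$ are
$$ D(p_{\theta_0}\,\|\,p_\theta)=\int p_{\theta_0}\log\frac{p_{\theta_0}}{p_\theta}\,d\mu, \qquad \mathcal{I}(\theta)=\int\Big(\frac{\partial}{\partial\theta}\log p_\theta\Big)^{2}p_\theta\,d\mu. $$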
References:
[1] Bahadur R. R.: Some Limit Theorems in Statistics. SIAM, Philadelphia 1971 MR 0315820 | Zbl 0257.62015
[2] Berk R. H.: Limiting behavior of posterior distributions when the model is incorrect. Ann. Math. Statist. 37 (1966), 51–58 DOI 10.1214/aoms/1177699597 | MR 0189176 | Zbl 0151.23802
[3] Cover T. M., Thomas J. A.: Elements of Information Theory. Wiley, New York 1991 MR 1122806 | Zbl 1140.94001
[4] DeGroot M. H.: Uncertainty, information and sequential experiments. Ann. Math. Statist. 33 (1962), 404–419 DOI 10.1214/aoms/1177704567 | MR 0139242
[5] DeGroot M. H.: Optimal Statistical Decisions. McGraw-Hill, New York 1970 MR 0356303
[6] Gallager R. G.: Information Theory and Reliable Communication. Wiley, New York 1968 Zbl 0295.94001
[7] Huber P. J.: Robust Statistics. Wiley, New York 1981 MR 0606374
[8] Kullback S.: Information Theory and Statistics. Wiley, New York 1959 MR 0103557 | Zbl 0897.62003
[9] Lehmann E. L.: Testing Statistical Hypotheses. Second edition. Wiley, New York 1986
[10] Liese F., Vajda I.: Necessary and sufficient conditions for consistency of generalized $M$-estimates. Metrika 42 (1995), 93–114 DOI 10.1007/BF01894328 | Zbl 0834.62024
[11] Lindley D. V.: On the measure of the information provided by an experiment. Ann. Math. Statist. 27 (1956), 986–1005 DOI 10.1214/aoms/1177728069
[12] Loève M.: Probability Theory. Wiley, New York 1963
[13] Perlman M. D.: On the strong consistency of approximate maximum likelihood estimators. In: Proc. Sixth Berkeley Symp. Math. Statist. Prob., 1972, pp. 263–281
[14] Pfanzagl J.: On the measurability and consistency of minimum contrast estimators. Metrika 14 (1969), 249–272 DOI 10.1007/BF02613654
[15] Pfanzagl J.: Parametric Statistical Theory. Walter de Gruyter, Berlin 1994 Zbl 0807.62016
[16] Rao C. R.: Linear Statistical Inference and its Applications. Wiley, New York 1965 Zbl 0256.62002
[17] Rényi A.: On the amount of information concerning an unknown parameter in a sequence of observations. Publ. Math. Inst. Hungar. Acad. Sci., Sec. A 9 (1964), 617–625
[18] Rényi A.: Statistics and information theory. Stud. Scient. Math. Hungar. 2 (1967), 249–256 Zbl 0155.27602
[19] Rockafellar R. T.: Convex Analysis. Princeton Univ. Press, Princeton, N. J. 1970
[20] Strasser H.: Consistency of maximum likelihood and Bayes estimates. Ann. Statist. 9 (1981), 1107–1113 DOI 10.1214/aos/1176345590 | Zbl 0483.62019
[21] Strasser H.: Mathematical Theory of Statistics. De Gruyter, Berlin 1985 Zbl 0594.62017
[22] Torgersen E.: Comparison of Statistical Experiments. Cambridge Univ. Press, Cambridge 1991 Zbl 1158.62006
[23] Vajda I.: Conditions equivalent to consistency of approximate MLE’s for stochastic processes. Stochastic Process. Appl. 56 (1995), 35–56 DOI 10.1016/0304-4149(94)00069-6 | Zbl 0817.62073