Keywords:
Fisher information; location parameter; Pitman estimators
Summary:
Zamir showed in 1998 that the classical Stam inequality for the Fisher information (about a location parameter), $$ 1/I(X+Y)\geq 1/I(X)+1/I(Y) $$ for independent random variables $X$, $Y$, is a simple corollary of basic properties of the Fisher information (monotonicity, additivity, and a reparametrization formula). The idea of his proof works for a special case of a general (not necessarily location) parameter. Stam-type inequalities are obtained for the Fisher information in a multivariate observation depending on a univariate location parameter and for the variance of the Pitman estimator of the latter.
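As a quick illustration (added here for orientation, not part of the original abstract, and using the standard convention that $I(Z)$ denotes the Fisher information of the density of $Z$ with respect to a location shift), independent Gaussian observations attain equality in Stam's inequality: if $X\sim N(\mu_X,\sigma_X^2)$ and $Y\sim N(\mu_Y,\sigma_Y^2)$ are independent, then $I(X)=1/\sigma_X^2$, $I(Y)=1/\sigma_Y^2$, and $X+Y\sim N(\mu_X+\mu_Y,\sigma_X^2+\sigma_Y^2)$, so $$ 1/I(X+Y)=\sigma_X^2+\sigma_Y^2=1/I(X)+1/I(Y). $$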
References:
[1] Carlen, E. A.: Superadditivity of Fisher's information and logarithmic Sobolev inequalities. J. Funct. Anal. 101 (1991), 194-211. DOI 10.1016/0022-1236(91)90155-X | MR 1132315 | Zbl 0732.60020
[2] Ibragimov, I. A., Khas'minskij, R. Z.: Statistical Estimation. Asymptotic Theory. Springer New York (1981). MR 0620321 | Zbl 0467.62026
[3] Kagan, A., Landsman, Z.: Statistical meaning of Carlen's superadditivity of the Fisher information. Statist. Probab. Lett. 32 (1997), 175-179. DOI 10.1016/S0167-7152(96)00070-3 | MR 1436863 | Zbl 0874.60002
[4] Kagan, A.: An inequality for the Pitman estimators related to the Stam inequality. Sankhyā A 64 (2002), 282-292. MR 1981759 | Zbl 1192.62099
[5] Kagan, A., Shepp, L. A.: A sufficiency paradox: an insufficient statistic preserving the Fisher information. Amer. Statist. 59 (2005), 54-56. DOI 10.1198/000313005X21041 | MR 2113195
[6] Kagan, A., Yu, T., Barron, A., Madiman, M.: Contribution to the theory of Pitman estimators. Submitted.
[7] Madiman, M., Barron, A.: The monotonicity of information in the central limit theorem and entropy power inequalities. Preprint Dept. of Statistics, Yale University (2006). MR 2128239
[8] Shao, J.: Mathematical Statistics, 2nd ed. Springer New York (2003). MR 2002723 | Zbl 1018.62001
[9] Stam, A. J.: Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inform. and Control 2 (1959), 101-112. DOI 10.1016/S0019-9958(59)90348-1 | MR 0109101 | Zbl 0085.34701
[10] Zamir, R.: A proof of the Fisher information inequality via a data processing argument. IEEE Trans. Inf. Theory 44 (1998), 1246-1250. DOI 10.1109/18.669301 | MR 1616672 | Zbl 0901.62005