[1] M. S. Ali, D. Silvey:
A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 (1966), 131-140.
MR 0196777 |
Zbl 0203.19902
[3] A. R. Barron, L. Györfi, E. C. van der Meulen:
Distribution estimation consistent in total variation and in two types of information divergence. IEEE Trans. Inform. Theory 38 (1992), 1437-1454.
DOI 10.1109/18.149496
[5] A. Bhattacharyya:
On some analogues of the amount of information and their use in statistical estimation. Sankhya 8 (1946), 1-14.
MR 0020242
[6] H. Chernoff:
A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations. Ann. Math. Statist. 23 (1952), 493-507.
DOI 10.1214/aoms/1177729330 |
MR 0057518
[8] I. Csiszár:
Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), 84-108.
MR 0164374
[9] I. Csiszár:
Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318.
MR 0219345 |
Zbl 0157.25802
[11] M. H. DeGroot:
Optimal Statistical Decisions. McGraw-Hill, New York 1970.
MR 0356303
[12] D. Feldman, F. Österreicher:
A note on $f$-divergences. Studia Sci. Math. Hungar. 24 (1989), 191-200.
MR 1051149 |
Zbl 0725.62005
[14] C. Guttenbrunner: On applications of the representation of $f$-divergences as averaged minimal Bayesian risk. In: Trans. 11th Prague Conf. Inform. Theory, Statist. Dec. Funct., Random Processes A, 1992, pp. 449-456.
[16] T. Kailath:
The divergence and Bhattacharyya distance measures in signal selection. IEEE Trans. Commun. Technol. 15 (1967), 52-60.
DOI 10.1109/TCOM.1967.1089532
[19] L. LeCam:
Locally asymptotically normal families of distributions. Univ. Calif. Publ. Statist. 3 (1960), 37-98.
MR 0126903
[20] L. LeCam: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986.
[23] F. Liese, I. Vajda:
$f$-divergences: Sufficiency, deficiency and testing of hypotheses. In: Advances in Inequalities from Probability Theory and Statistics (N. S. Barnett and S. S. Dragomir, eds.), Nova Science Publishers, Inc., New York 2008, pp. 113-149.
MR 2459971
[24] F. Liese, K. J. Miescke:
Statistical Decision Theory, Estimation, Testing and Selection. Springer, New York 2008.
MR 2421720 |
Zbl 1154.62008
[26] D. Mussmann: Decision rules based on the distance, for problems of fit, two samples and estimation. Studia Sci. Math. Hungar. 14 (1979), 37-41.
[31] H. V. Poor:
Robust decision design using a distance criterion. IEEE Trans. Inform. Theory 26 (1980), 578-587.
MR 0583942 |
Zbl 0445.62017
[32] T. R. C. Read, N. A. C. Cressie:
Goodness-of-Fit Statistics for Discrete Multivariate Data. Springer, New York 1988.
MR 0955054 |
Zbl 0663.62065
[33] A. Rényi:
On measures of entropy and information. In: Proc. 4th Berkeley Symp. on Math. Statist. and Probability. Univ. of California Press, Berkeley 1961, pp. 547-561.
MR 0132570 |
Zbl 0106.33001
[38] F. Topsøe:
Information-theoretical optimization techniques. Kybernetika 15 (1979), 7-17.
MR 0529888
[39] F. Topsøe:
Some inequalities for information divergence and related measures of discrimination. IEEE Trans. Inform. Theory 46 (2000), 1602-1609.
DOI 10.1109/18.850703 |
MR 1768575
[40] E. Torgersen:
Comparison of Statistical Experiments. Cambridge Univ. Press, Cambridge 1991.
MR 1104437 |
Zbl 1158.62006
[42] I. Vajda:
Theory of Statistical Inference and Information. Kluwer Academic Publishers, Dordrecht - Boston - London 1989.
Zbl 0711.62002
[43] I. Vajda:
On metric divergences of probability measures. Kybernetika 45 (2009), 885-900.
MR 2650071
[45] I. Vincze:
On the concept and measure of information contained in an observation. In: Contributions to Probability (J. Gani and V. K. Rohatgi, eds.), Academic Press, New York 1981, pp. 207-214.
MR 0618690 |
Zbl 0531.62002