Title:
|
On generalized entropies, Bayesian decisions and statistical diversity (English) |
Author:
|
Vajda, Igor |
Author:
|
Zvárová, Jana |
Language:
|
English |
Journal:
|
Kybernetika |
ISSN:
|
0023-5954 |
Volume:
|
43 |
Issue:
|
5 |
Year:
|
2007 |
Pages:
|
675-696 |
Summary lang:
|
English |
. |
Category:
|
math |
. |
Summary:
|
The paper summarizes and extends the theory of generalized $\phi $-entropies $H_{\phi }(X)$ of random variables $X$ obtained as $\phi $-informations $I_{\phi }(X;Y)$ about $X$ maximized over random variables $Y$. Among the new results is a proof that these entropies need not be concave functions of the distributions $p_{X}$. An extended class of power entropies $H_{\alpha }(X)$ is introduced, parametrized by $\alpha \in {\mathbb{R}}$, where $H_{\alpha }(X)$ are concave in $p_{X}$ for $\alpha \ge 0$ and convex for $\alpha <0$. It is proved that all power entropies with $\alpha \le 2$ are maximal $\phi $-informations $I_{\phi }(X;X)$ for appropriate $\phi $ depending on $\alpha $. Prominent members of this subclass of power entropies are the Shannon entropy $H_{1}(X)$ and the quadratic entropy $H_{2}(X)$. The paper also investigates the tightness of practically important, previously established relations between these two entropies and the errors $e(X)$ of Bayesian decisions about possible realizations of $X$. The quadratic entropy is shown to provide estimates which are on average more than 100 % tighter than those based on the Shannon entropy, and this tightness is shown to increase even further as $\alpha $ increases beyond $\alpha =2$. Finally, the paper studies various measures of statistical diversity and introduces a general measure of anisotony between them. This measure is numerically evaluated for the entropic measures of diversity $H_1(X)$ and $H_2(X)$. (English) |
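Note:
|
A minimal illustration of the quantities named in the summary, under standard normalizations that may differ from the paper's own: for a discrete distribution $p_X=(p_1,\dots ,p_m)$, the Shannon and quadratic entropies are $H_{1}(X)=-\sum_{i=1}^{m}p_i\log p_i$ and $H_{2}(X)=1-\sum_{i=1}^{m}p_i^{2}$, and the Bayes error of deciding on a realization of $X$ without observations is $e(X)=1-\max_{i}p_i$. Since $\sum_{i}p_i^{2}\ge (\max_{i}p_i)^{2}$ and $(1-\max_{i}p_i)^{2}\ge 0$, one obtains the elementary bound $e(X)\ge H_{2}(X)/2$, an example of the entropy-error relations whose tightness the paper examines. |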
Keyword:
|
$\phi $-divergences |
Keyword:
|
$\phi $-informations |
Keyword:
|
power divergences |
Keyword:
|
power entropies |
Keyword:
|
Shannon entropy |
Keyword:
|
quadratic entropy |
Keyword:
|
Bayes error |
Keyword:
|
Simpson diversity |
Keyword:
|
Emlen diversity |
MSC:
|
62C10 |
MSC:
|
94A17 |
idZBL:
|
Zbl 1143.94006 |
idMR:
|
MR2376331 |
. |
Date available:
|
2009-09-24T20:28:14Z |
Last updated:
|
2012-06-06 |
Stable URL:
|
http://hdl.handle.net/10338.dmlcz/135806 |
. |
Reference:
|
[1] Cover T., Thomas J.: Elements of Information Theory. Wiley, New York 1991 Zbl 1140.94001, MR 1122806 |
Reference:
|
[3] Cressie N., Read T. R. C.: Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440–464 Zbl 0571.62017, MR 0790631 |
Reference:
|
[4] Csiszár I.: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. Ser. A 8 (1963), 85–108 Zbl 0124.08703, MR 0164374 |
Reference:
|
[5] Csiszár I.: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318 MR 0219345 |
Reference:
|
[6] Csiszár I.: A class of measures of informativity of observation channels. Period. Math. Hungar. 2 (1972), 191–213 Zbl 0247.94018, MR 0335152 |
Reference:
|
[7] Dalton H.: The Inequality of Incomes. Routledge & Kegan Paul, London 1925 |
Reference:
|
[8] Devijver P., Kittler J.: Pattern Recognition: A Statistical Approach. Prentice Hall, Englewood Cliffs, NJ 1982 Zbl 0542.68071, MR 0692767 |
Reference:
|
[9] Devroye L., Györfi L., Lugosi G.: A Probabilistic Theory of Pattern Recognition. Springer, Berlin 1996 MR 1383093 |
Reference:
|
[10] Emlen J. M.: Ecology: An Evolutionary Approach. Addison-Wesley, Reading 1973 |
Reference:
|
[11] Gini C.: Variabilità e Mutabilità. Studi Economico-Giuridici della R. Univ. di Cagliari 3 (1912), Part 2, p. 80 |
Reference:
|
[12] Harremoës P., Topsøe F.: Inequalities between entropy and index of coincidence. IEEE Trans. Inform. Theory 47 (2001), 2944–2960 |
Reference:
|
[13] Havrda J., Charvát F.: Concept of structural $a$-entropy. Kybernetika 3 (1967), 30–35 Zbl 0178.22401, MR 0209067 |
Reference:
|
[14] Höffding W.: Masstabinvariante Korrelationstheorie. Teubner, Leipzig 1940 |
Reference:
|
[15] Höffding W.: Stochastische Abhängigkeit und funktionaler Zusammenhang. Skand. Aktuar. Tidskr. 25 (1942), 200–207 Zbl 0027.41401 |
Reference:
|
[16] Kovalevskij V. A.: The problem of character recognition from the point of view of mathematical statistics. Character Readers and Pattern Recognition, 3–30. Spartan Books, New York 1967 |
Reference:
|
[17] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 Zbl 0656.62004, MR 0926905 |
Reference:
|
[18] Liese F., Vajda I.: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394–4412 MR 2300826 |
Reference:
|
[19] Marshall A. W., Olkin I.: Inequalities: Theory of Majorization and its Applications. Academic Press, New York 1979 Zbl 1219.26003, MR 0552278 |
Reference:
|
[20] Morales D., Pardo L., Vajda I.: Uncertainty of discrete stochastic systems. IEEE Trans. Systems, Man Cybernet. Part A 26 (1996), 681–697 |
Reference:
|
[21] Pearson K.: On the theory of contingency and its relation to association and normal correlation. Drapers Company Research Memoirs, Biometric Ser. 1, London 1904 |
Reference:
|
[22] Perez A.: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1–21 Zbl 0153.48403, MR 0208775 |
Reference:
|
[23] Rényi A.: On measures of dependence. Acta Math. Acad. Sci. Hungar. 10 (1959), 441–451 Zbl 0091.14403, MR 0115203 |
Reference:
|
[24] Rényi A.: On measures of entropy and information. In: Proc. Fourth Berkeley Symposium on Math. Statist. Probab., Volume 1, Univ. Calif. Press, Berkeley 1961, pp. 547–561 MR 0132570 |
Reference:
|
[25] Sen A.: On Economic Inequality. Oxford Univ. Press, London 1973 |
Reference:
|
[26] Simpson E. H.: Measurement of diversity. Nature 163 (1949), 688 Zbl 0032.03902 |
Reference:
|
[27] Tschuprow A.: Grundbegriffe und Grundprobleme der Korrelationstheorie. Berlin 1925 |
Reference:
|
[28] Vajda I.: Bounds on the minimal error probability and checking a finite or countable number of hypotheses. Information Transmission Problems 4 (1968), 9–17 MR 0267685 |
Reference:
|
[29] Vajda I.: Theory of Statistical Inference and Information. Kluwer, Boston 1989 Zbl 0711.62002 |
Reference:
|
[30] Vajda I., Vašek K.: Majorization, concave entropies, and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105–115 Zbl 0601.62006, MR 0806056 |
Reference:
|
[31] Vajda I., Zvárová J.: On relations between informations, entropies and Bayesian decisions. In: Prague Stochastics 2006 (M. Hušková and M. Janžura, eds.), Matfyzpress, Prague 2006, pp. 709–718 |
Reference:
|
[32] Zvárová J.: On measures of statistical dependence. Čas. pěst. matemat. 99 (1974), 15–29 Zbl 0282.62048, MR 0365799 |
Reference:
|
[33] Zvárová J.: On medical informatics structure. Internat. J. Medical Informatics 44 (1997), 75–81 |
Reference:
|
[34] Zvárová J.: Information Measures of Stochastic Dependence and Diversity: Theory and Medical Informatics Applications. Doctor of Sciences Dissertation, Academy of Sciences of the Czech Republic, Institute of Informatics, Prague 1998 |
Reference:
|
[35] Zvárová J., Mazura I.: Stochastic Genetics (in Czech). Charles University, Karolinum, Prague 2001 |
Reference:
|
[36] Zvárová J., Vajda I.: On genetic information, diversity and distance. Methods of Inform. in Medicine 2 (2006), 173–179 |
Reference:
|
[37] Zvárová J., Vajda I.: On isotony and anisotony between Gini and Shannon measures of diversity. Preprint, Prague 2006 |
. |