
Article

Title: Why $L_1$ view and what is next? (English)
Author: Györfi, László
Author: Krzyżak, Adam
Language: English
Journal: Kybernetika
ISSN: 0023-5954
Volume: 47
Issue: 6
Year: 2011
Pages: 840-854
Summary lang: English
.
Category: math
.
Summary: N. N. Cencov wrote a commentary chapter, included in the Appendix of the Russian translation of the Devroye and Györfi book [15], collecting some arguments supporting the $L_1$ view of density estimation. Cencov's work is available in Russian only and has never been translated, so the late Igor Vajda decided to translate Cencov's paper and to add some remarks on the occasion of organizing the session “25 Years of the $L_1$ Density Estimation” at the Prague Stochastics 2010 Symposium. In this paper we complete his task, i.e., we translate Cencov's chapter and insert some remarks on the related literature, focusing primarily on Igor's results. We would also like to acknowledge the excellent work of Alexandre Tsybakov, who translated the Devroye and Györfi book into Russian, annotated it with valuable comments and included some related references published in Russian only. (English)
Keyword: Cencov's comments
Keyword: inverse problems in distribution estimation
Keyword: $L_1$ density estimation
Keyword: variational distance
Keyword: $\phi$-divergence
MSC: 62G08
MSC: 62G20
idZBL: Zbl 06047589
idMR: MR2907845
.
Date available: 2011-12-08T09:58:05Z
Last updated: 2013-09-22
Stable URL: http://hdl.handle.net/10338.dmlcz/141728
.
Reference: [1] Abou-Jaoude, S.: Conditions nécessaires et suffisantes de convergence $L_1$ en probabilité de l’histogramme pour une densité.Ann. Inst. H. Poincaré XII (1976), 213–231. MR 0428574
Reference: [2] Barndorff-Nielsen, O.: Information and Exponential Families in Statistical Theory.Wiley, 1978. Zbl 0387.62011, MR 0489333
Reference: [3] Barron, A. R., Györfi, L., Meulen, E. C. van der: Distribution estimation consistent in total variation and two types of information divergence.IEEE Trans. Inform. Theory 38 (1992), 1437–1454. MR 1178189, 10.1109/18.149496
Reference: [4] Cencov, N. N.: Estimation of unknown density function from observations.(in Russian) Trans. SSSR Acad. Sci. 147 (1962), 45–48. MR 0143278
Reference: [5] Cencov, N. N.: Categories of mathematical statistics.(in Russian) Trans. SSSR Acad. Sci. 164 (1965), 511–514. MR 0185710
Reference: [6] Cencov, N. N.: General theory of exponential families of distribution functions.Theory Probab. Appl. 11 (1966), 483–494. MR 0203847
Reference: [7] Cencov, N. N.: Asymmetric distance between distribution functions, entropy and Pythagoras theorem.(in Russian) Math. Notes 4 (1968), 323–332. MR 0239631
Reference: [8] Cencov, N. N.: Statistical Decision Rules and Optimal Inference.(in Russian) Nauka, Moscow 1972. MR 0343398
Reference: [9] Cencov, N. N.: Algebraic foundation of mathematical statistics.Math. Operationsforsch. Statist., Ser. Statistics 9 (1978), 267–276. MR 0512264
Reference: [10] Cencov, N. N.: On basic concepts of mathematical statistics.Banach Center Publ. 6 (1980), 85–94. MR 0599373
Reference: [11] Cencov, N. N.: On correctness of the pointwise estimation problem.(in Russian) Theory Probab. Appl. 26 (1981), 15–31. MR 0605633
Reference: [12] Csiszár, I., Fischer, J.: Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen.Publ. Math. Inst. Hungar. Acad. Sci. 7 (1962), 159–180. MR 0191734
Reference: [13] Csiszár, I.: Information-type measures of divergence of probability distributions and indirect observations.Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345
Reference: [14] Csiszár, I.: On topological properties of $f$-divergence.Studia Sci. Math. Hungar. 2 (1967), 329–339.
Reference: [15] Devroye, L., Györfi, L.: Nonparametric Density Estimation: The $L_1$ View.Wiley, 1985. Russian translation: Mir, Moscow, 1988 (Translated from English to Russian by A. Tsybakov). MR 0944527
Reference: [16] Devroye, L., Györfi, L.: No empirical measure can converge in the total variation sense for all distributions.Ann. Statist. 18 (1990), 1496–1499. MR 1062724, 10.1214/aos/1176347765
Reference: [17] Frolov, A. S., Cencov, N. N.: Application of dependent observations in the Monte Carlo method for recovering smooth curves.(in Russian) In: Proc. 6th Russian Conference on Probability Theory and Mathematical Statistics, Vilnius 1962, pp. 425–437. MR 0196902
Reference: [18] Györfi, L., Páli, I., Meulen, E. C. van der: There is no universal source code for infinite alphabet.IEEE Trans. Inform. Theory 40 (1994), 267–271. MR 1281931, 10.1109/18.272495
Reference: [19] Györfi, L., Páli, I., Meulen, E. C. van der: On universal noiseless source coding for infinite source alphabets.Europ. Trans. Telecomm. 4 (1993), 9–16.
Reference: [20] Hartigan, J. A.: The likelihood and invariance principles.Ann. Math. Statist. 38 (1967), 533–539. MR 0224184
Reference: [21] Ibragimov, I. A., Hasminski, R. Z.: On estimation of density.(in Russian) Scientific Notes of LOMI Seminars 98 (1980), 61–86.
Reference: [22] Kafka, P., Österreicher, F., Vincze, I.: On powers of $f$-divergences defining a distance.Studia Sci. Math. Hungar. 26 (1991), 415–422. MR 1197090
Reference: [23] Kemperman, J. H. B.: An optimum rate of transmitting information.Ann. Math. Statist. 40 (1969), 2156–2177. MR 0252112, 10.1214/aoms/1177697293
Reference: [24] Khosravifard, M., Fooladivanda, D., Gulliver, T. A.: Confliction of the convexity and metric properties in f-divergences.IEICE Trans. Fundamentals E90-A (2007), 1848–1853.
Reference: [25] Kolmogorov, A. N.: Sulla determinazione empirica di una legge di distribuzione.Giornale dell’Istituto Italiano degli Attuari 4 (1933), 83–91. Zbl 0006.17402
Reference: [26] Kriz, T. A., Talacko, J. V.: Equivalence of the maximum likelihood estimator to a minimum entropy estimator.Trab. Estadist. Invest. Oper. 19 (1968), 55–65. Zbl 0169.21401, MR 0238422, 10.1007/BF03001716
Reference: [27] Kullback, S.: A lower bound for discrimination in terms of variation.IEEE Trans. Inform. Theory 13 (1967), 126–127. 10.1109/TIT.1967.1053968
Reference: [28] Kullback, S.: Correction to “A lower bound for discrimination in terms of variation".IEEE Trans. Inform. Theory 16 (1970), 652. 10.1109/TIT.1970.1054514
Reference: [29] Morse, N., Sacksteder, R.: Statistical isomorphism.Ann. Math. Statist. 37 (1966), 203–214. Zbl 0158.37105, MR 0191060, 10.1214/aoms/1177699610
Reference: [30] LeCam, L.: On some asymptotic properties of maximum likelihood estimates and related Bayes estimates.Univ. Calif. Publ. Statist. 1 (1953), 267–329. MR 0054913
Reference: [31] Liese, F., Vajda, I.: Convex Statistical Distances.Teubner, Leipzig 1987. Zbl 0656.62004, MR 0926905
Reference: [32] Morozova, E. A., Cencov, N. N.: Markov maps in noncommutative probability theory and mathematical statistics.(in Russian) In: Proc. 4th Internat. Vilnius Conf. Probability Theory and Mathematical Statistics, VNU Science Press 2 (1987), pp. 287–310. Zbl 0654.46058, MR 0901540
Reference: [33] Nadaraya, E. A.: On nonparametric estimation of Bayes risk in classification problems.(in Russian) Trans. Georgian Acad. Sci. 82 (1976), 277–280. MR 0426276
Reference: [34] Nadaraya, E. A.: Nonparametric Estimation of Probability Density and Regression Curve.(in Russian) Tbilisi State University, Georgia 1983. MR 0783637
Reference: [35] Österreicher, F., Vajda, I.: A new class of metric divergences on probability spaces and its statistical applications.Ann. Inst. Statist. Math. 55 (2003), 639–653. MR 2007803, 10.1007/BF02517812
Reference: [36] Sobol, I. M.: Multidimensional Quadrature Formulas and Haar Functions.(in Russian) Nauka, Moscow 1969. MR 0422968
Reference: [37] Statulavicius, W. W.: On Some Asymptotic Properties of Minimax Density Estimates.(in Russian) PhD. Thesis, Vilnius State University 1986.
Reference: [38] Stratonovich, R. L.: Rate of convergence of probability density estimates.(in Russian) Trans. SSSR Acad. Sci., Ser. Technical Cybernetics 6 (1969), 3–15.
Reference: [39] Toussaint, G. T.: Sharper lower bounds for information in terms of variation.IEEE Trans. Inform. Theory 21 (1975), 99–103. MR 0373770, 10.1109/TIT.1975.1055311
Reference: [40] Vajda, I.: Note on discrimination information and variation.IEEE Trans. Inform. Theory IT-16 (1970), 771–773. Zbl 0206.21001, MR 0275575, 10.1109/TIT.1970.1054557
Reference: [41] Vajda, I.: On the f-divergence and singularity of probability measures.Period. Math. Hungar. 2 (1972), 223–234. Zbl 0248.62001, MR 0335163, 10.1007/BF02018663
Reference: [42] Vajda, I.: On metric divergences of probability measures.Kybernetika 45 (2009), 885–900. Zbl 1186.94421, MR 2650071
Reference: [43] Wald, A.: Contributions to the theory of statistical estimation and testing hypotheses.Ann. Math. Statist. 10 (1939), 299–326. Zbl 0024.05405, MR 0000932, 10.1214/aoms/1177732144
.

Files

Kybernetika_47-2011-6_3.pdf — 324.1 KB — application/pdf