
Article

Title: Scaling of model approximation errors and expected entropy distances (English)
Author: Montúfar, Guido F.
Author: Rauh, Johannes
Language: English
Journal: Kybernetika
ISSN: 0023-5954 (print)
ISSN: 1805-949X (online)
Volume: 50
Issue: 2
Year: 2014
Pages: 234-245
Summary lang: English
.
Category: math
.
Summary: We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded above by the constant $1-\gamma$, where $\gamma$ is the Euler-Mascheroni constant. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters, the expected values of the divergence behave in a similar way. These results serve as a reference for ranking the approximation capabilities of other statistical models. (English)
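The simplest instance of the summary's claim is the model consisting of the uniform distribution $u$ alone: for $p$ drawn from the uniform Dirichlet prior on $n$ states, $D(p\|u) = \log n - H(p)$, and its expectation tends to $1-\gamma \approx 0.4228$ nats as $n \to \infty$. The Monte Carlo sketch below (our illustration, not code from the paper; the values of `n` and `samples` are arbitrary choices) checks this numerically:

```python
# Monte Carlo estimate of E[D(p || u)] for p ~ Dirichlet(1, ..., 1),
# where u is the uniform distribution on n states.
# Expected to approach 1 - gamma (Euler-Mascheroni constant) for large n.
import numpy as np

rng = np.random.default_rng(0)
n = 1000          # cardinality of the sample space (illustrative choice)
samples = 2000    # number of prior draws (illustrative choice)

# Draws from the uniform Dirichlet prior on the (n-1)-simplex.
p = rng.dirichlet(np.ones(n), size=samples)

# D(p || u) = log n - H(p), computed in nats for each draw.
kl = np.log(n) + np.sum(p * np.log(p), axis=1)
estimate = kl.mean()

euler_gamma = 0.5772156649015329
print(f"E[D(p||u)] ~= {estimate:.4f}; limit 1 - gamma = {1 - euler_gamma:.4f}")
```

For finite $n$ the expectation has the closed form $\log n - \psi(n+1) + \psi(2)$ (with $\psi$ the digamma function), so the estimate should already be close to the limit at $n = 1000$.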
Keyword: exponential families
Keyword: KL divergence
Keyword: MLE
Keyword: Dirichlet prior
MSC: 62B10
MSC: 62F15
MSC: 62F25
MSC: 68T30
idZBL: Zbl 06325222
idMR: MR3216992
DOI: 10.14736/kyb-2014-2-0234
.
Date available: 2014-06-06T14:44:12Z
Last updated: 2016-01-03
Stable URL: http://hdl.handle.net/10338.dmlcz/143791
.
Reference: [1] Ay, N.: An information-geometric approach to a theory of pragmatic structuring. Ann. Probab. 30 (2002), 416-436. Zbl 1010.62007, MR 1894113, 10.1214/aop/1020107773
Reference: [2] Drton, M., Sturmfels, B., Sullivant, S.: Lectures on Algebraic Statistics. Birkhäuser, Basel 2009. Zbl 1166.13001, MR 2723140
Reference: [3] Frigyik, B. A., Kapila, A., Gupta, M. R.: Introduction to the Dirichlet Distribution and Related Processes. Technical Report, Department of Electrical Engineering, University of Washington, 2010.
Reference: [4] Matúš, F., Ay, N.: On maximization of the information divergence from an exponential family. In: Proc. WUPES'03, University of Economics, Prague 2003, pp. 199-204.
Reference: [5] Matúš, F., Rauh, J.: Maximization of the information divergence from an exponential family and criticality. In: Proc. ISIT, St. Petersburg 2011, pp. 903-907.
Reference: [6] Montúfar, G., Rauh, J., Ay, N.: Expressive power and approximation errors of restricted Boltzmann machines. In: Advances in NIPS 24, MIT Press, Cambridge 2011, pp. 415-423.
Reference: [7] Nemenman, I., Shafee, F., Bialek, W.: Entropy and inference, revisited. In: Advances in NIPS 14, MIT Press, Cambridge 2001, pp. 471-478.
Reference: [8] Rauh, J.: Finding the Maximizers of the Information Divergence from an Exponential Family. Ph.D. Thesis, Universität Leipzig 2011. MR 2817016
Reference: [9] Rauh, J.: Optimally approximating exponential families. Kybernetika 49 (2013), 199-215. Zbl 1283.94027, MR 3085392
Reference: [10] Wolpert, D., Wolf, D.: Estimating functions of probability distributions from a finite set of samples. Phys. Rev. E 52 (1995), 6841-6854. MR 1384746, 10.1103/PhysRevE.52.6841
.