Keywords: maximum likelihood estimation; information divergence; Gaussian process; autoregressive processes
The paper investigates the relation between maximum likelihood and minimum $I$-divergence estimates of unknown parameters and studies the asymptotic behaviour of the maximum of the likelihood ratio. Observations are assumed to be made in continuous time.
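The relation between the two estimates has a simple discrete-time analogue: for a Gaussian family, maximizing the likelihood over the parameter is equivalent to minimizing the $I$-divergence from the fitted distribution to the model. The sketch below illustrates this for the $N(\theta, 1)$ family, where $D(N(\bar{x},1)\,\|\,N(\theta,1)) = (\bar{x}-\theta)^2/2$; the sample size, parameter grid, and the choice of this family are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample from N(theta, 1) with true theta = 1.5 (illustrative choice)
x = rng.normal(loc=1.5, scale=1.0, size=500)

thetas = np.linspace(0.0, 3.0, 3001)  # candidate parameter grid

# Log-likelihood of N(theta, 1) for the sample (additive constants dropped)
loglik = np.array([-0.5 * np.sum((x - t) ** 2) for t in thetas])

# I-divergence between N(xbar, 1) and N(theta, 1): D = (xbar - theta)^2 / 2
xbar = x.mean()
idiv = 0.5 * (xbar - thetas) ** 2

theta_ml = thetas[np.argmax(loglik)]  # maximum likelihood estimate on the grid
theta_id = thetas[np.argmin(idiv)]    # minimum I-divergence estimate on the grid
```

Both criteria are monotone transformations of $(\bar{x}-\theta)^2$, so on the same grid they select the same point, namely the grid value nearest the sample mean.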
[1] Anděl J.: Statistical Analysis of Time Series (in Czech). SNTL, Prague 1976
[2] Dzhaparidze K.: Parameter Estimation and Hypothesis Testing in Spectral Analysis of Stationary Time Series. (Springer Series in Statistics.) Springer Verlag, Berlin 1986 MR 0812272 | Zbl 0584.62157
[3] Hájek J.: On the simple linear model for Gaussian processes. In: Trans. of the 2nd Prague Conference, Academia, Prague 1959, pp. 185–197
[4] Hájek J.: On linear statistical problems in stochastic processes. Czechoslovak Math. J. 12 (87) (1962), 404–444 MR 0152090
[5] Michálek J.: Asymptotic Rényi’s rate of Gaussian processes. Problems Control Inform. Theory 19 (1990), 3, 209–227 Zbl 0705.62079
[6] Michálek J.: Maximum likelihood principle and $I$-divergence: observations in discrete time. Kybernetika 34 (1998), 265–288 MR 1640966
[7] Pisarenko V. F.: On absolute continuity of the measures corresponding to a rational spectral density function (in Russian). Teor. Veroyatnost. i Primenen. IV (1959), 481–481
[8] Pisarenko V. F.: On parameter estimation of a Gaussian stationary process with a spectral density function (in Russian). Lithuanian Math. J. (1962)
[9] Rozanov J. A.: On the application of a central limit theorem. In: Proc. Fourth Berkeley Symp. Math. Stat. Prob., Berkeley 1961, Vol. 2