
Summary:
Let $X_1,\ldots,X_N$ be a finite random sequence with the expectation $EX_t=\alpha\varphi_t\ (1\leq t\leq N)$ and with a regular (nonsingular) covariance matrix $\bold G$. The matrix $\bold G$ and the values of $\varphi_t$ are assumed to be known; $\alpha$ is an unknown parameter. The least squares estimate $\hat{\alpha}$ and the best linear unbiased estimate (BLUE) $\tilde{\alpha}$ of the parameter $\alpha$ are considered, and the efficiency $\ell_N=\operatorname{var}\hat{\alpha}/\operatorname{var}\tilde{\alpha}$ is derived. The exact value of $\ell_N$ is given for the cases when $X_1,\ldots,X_N$ is a finite part of an autoregressive series of the first or of the second order with $\varphi_t\equiv 1$ or $\varphi_t=t\ (1\leq t\leq N)$, and for an autoregressive series of the $n$-th order with $\varphi_t\equiv 1$. The efficiency and the asymptotic efficiency of the BLUE $\tilde{\alpha}$ in the case when $\bold G$ is not the true covariance matrix are also considered.
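For orientation, under the model above the two estimates and the efficiency admit the standard closed forms; the following is a minimal sketch in the usual vector notation $\varphi=(\varphi_1,\ldots,\varphi_N)'$, $X=(X_1,\ldots,X_N)'$, which the summary itself does not spell out (cf. [4], [5]):
$$\hat{\alpha}=\frac{\varphi' X}{\varphi'\varphi},\qquad \tilde{\alpha}=\frac{\varphi'\bold G^{-1}X}{\varphi'\bold G^{-1}\varphi},\qquad \operatorname{var}\hat{\alpha}=\frac{\varphi'\bold G\varphi}{(\varphi'\varphi)^2},\qquad \operatorname{var}\tilde{\alpha}=\frac{1}{\varphi'\bold G^{-1}\varphi},$$
$$\ell_N=\frac{\operatorname{var}\hat{\alpha}}{\operatorname{var}\tilde{\alpha}}=\frac{(\varphi'\bold G\varphi)\,(\varphi'\bold G^{-1}\varphi)}{(\varphi'\varphi)^2}\geq 1.$$
The inequality $\ell_N\geq 1$ follows from the Cauchy–Schwarz (Kantorovich) inequality, with equality exactly when the least squares estimate is itself the BLUE.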
References:
[1] U. Grenander, M. Rosenblatt: Statistical analysis of stationary time series. New York, 1957. MR 0084975
[2] J. Hájek: On linear statistical problems in stochastic processes. Czech. Math. J. 12 (87), 1962, 404-444. MR 0152090
[3] E. J. Hannan: Time series analysis (Russian translation). Moscow, 1964. Zbl 0116.11402
[4] T. A. Magness, J. B. McGuire: Comparison of least squares and minimum variance estimates of regression parameters. Ann. Math. Stat. 33, 1962, 462-470. DOI 10.1214/aoms/1177704573 | MR 0141201
[5] G. S. Watson: Linear least squares regression. Ann. Math. Stat. 38, 1967, 1679-1699. DOI 10.1214/aoms/1177698603 | MR 0219206 | Zbl 0155.26801