
Article

Title: Rank theory approach to ridge, LASSO, preliminary test and Stein-type estimators: Comparative study (English)
Author: Saleh, A. K. Md. Ehsanes
Author: Navrátil, Radim
Language: English
Journal: Kybernetika
ISSN: 0023-5954 (print)
ISSN: 1805-949X (online)
Volume: 54
Issue: 5
Year: 2018
Pages: 958-977
Summary lang: English
.
Category: math
.
Summary: In the development of efficient predictive models, the key is to identify suitable predictors for a given linear model. For the first time, this paper provides a comparative study of ridge regression, LASSO, preliminary test and Stein-type estimators based on the theory of rank statistics. Under the orthonormal design matrix of a given linear model, we find that the rank-based ridge estimator uniformly outperforms the usual rank estimator, the restricted R-estimator, the rank-based LASSO, and the preliminary test and Stein-type R-estimators. On the other hand, none of LASSO, the usual R-estimator, and the preliminary test and Stein-type R-estimators uniformly outperforms the others. The region where LASSO dominates all the R-estimators (except the ridge R-estimator) is an interval around the origin of the parameter space. Finally, we observe that the L$_2$-risk of the restricted R-estimator equals the lower bound on the L$_2$-risk of LASSO. Our conclusions are based on L$_2$-risk analysis and relative L$_2$-risk efficiencies with related tables and graphs. (English)
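Note: Under an orthonormal design the classical least-squares analogues of these estimators act coordinatewise: ridge shrinks each least-squares coordinate proportionally, LASSO soft-thresholds it, and the preliminary test estimator keeps or discards it according to a test. The following minimal Monte Carlo sketch illustrates the kind of empirical L$_2$-risk comparison summarized above; it uses these least-squares forms rather than the rank-based R-estimators studied in the paper, and the true coefficient vector and the tuning constants (k, lam, c) are hypothetical choices made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    p, n_rep = 20, 5000              # dimension and number of Monte Carlo replications
    theta = np.zeros(p)
    theta[:5] = 1.0                  # hypothetical sparse "true" coefficient vector
    k, lam, c = 0.5, 0.8, 2.0        # arbitrary ridge, LASSO and pretest tuning constants

    # Under an orthonormal design with unit error variance, the least-squares
    # estimate of each coordinate is the true value plus standard normal noise.
    z = theta + rng.standard_normal((n_rep, p))

    ridge = z / (1.0 + k)                                # proportional shrinkage
    lasso = np.sign(z) * np.maximum(np.abs(z) - lam, 0)  # soft-thresholding
    pretest = np.where(np.abs(z) > c, z, 0.0)            # keep-or-kill after a test

    def l2_risk(est):
        # Empirical L2-risk: average squared distance from theta over replications.
        return np.mean(np.sum((est - theta) ** 2, axis=1))

    for name, est in [("LSE", z), ("ridge", ridge), ("LASSO", lasso), ("pretest", pretest)]:
        print(f"{name:8s} empirical L2-risk: {l2_risk(est):.3f}")

Rerunning the sketch with theta set to the zero vector (the origin of the parameter space) reproduces the qualitative pattern described in the summary: near the origin the thresholding-type estimators have markedly smaller empirical risk than the unrestricted estimator.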
Keyword: efficiency of LASSO
Keyword: penalty estimators
Keyword: preliminary test
Keyword: Stein-type estimator
Keyword: ridge estimator
Keyword: L$_2$-risk function
MSC: 62G05
MSC: 62J05
MSC: 62J07
idZBL: Zbl 07031754
idMR: MR3893130
DOI: 10.14736/kyb-2018-5-0958
.
Date available: 2018-12-14T08:09:26Z
Last updated: 2020-01-05
Stable URL: http://hdl.handle.net/10338.dmlcz/147537
.
Reference: [1] Belloni, A., Chernozhukov, V.: Least squares after model selection in high-dimensional sparse models..Bernoulli 19 (2013), 521-547. MR 3037163, 10.3150/11-bej410
Reference: [2] Breiman, L.: Heuristics of instability and stabilization in model selection..Ann. Statist. 24 (1996), 2350-2383. MR 1425957, 10.1214/aos/1032181158
Reference: [3] Donoho, D. L., Johnstone, I. M.: Minimax estimation via wavelet shrinkage..Ann. Statist. 26 (1998), 879-921. MR 1635414, 10.1214/aos/1024691081
Reference: [4] Draper, N. R., Van Nostrand, R. C.: Ridge regression and James-Stein estimation: review and comments..Technometrics 21 (1979), 451-466. MR 0555086, 10.2307/1268284
Reference: [5] Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its oracle properties..J. Amer. Statist. Assoc. 96 (2001), 1348-1360. MR 1946581, 10.1198/016214501753382273
Reference: [6] Frank, I. E., Friedman, J. H.: A statistical view of some chemometrics regression tools..Technometrics 35 (1993), 109-135. 10.1080/00401706.1993.10485033
Reference: [7] Hoerl, A. E., Kennard, R. W.: Ridge regression: Biased estimation for nonorthogonal problems..Technometrics 12 (1970), 55-67. 10.1080/00401706.1970.10488634
Reference: [8] James, W., Stein, C.: Estimation with quadratic loss..In: Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, University of California Press 1961, pp. 361-379. MR 0133191
Reference: [9] Jurečková, J.: Nonparametric estimate of regression coefficients..Ann. Math. Statist. 42 (1971), 1328-1338. MR 0295487, 10.1214/aoms/1177693245
Reference: [10] Hansen, B. E.: The risk of James-Stein and Lasso shrinkage..Econometric Rev. 35 (2015), 456-470. MR 3511027
Reference: [11] Saleh, A. K. Md. E.: Theory of Preliminary Test and Stein-Type Estimators with Applications..John Wiley and Sons, New York 2006. MR 2218139, 10.1002/0471773751
Reference: [12] Saleh, A. K. Md. E., Arashi, M., Norouzirad, M., Kibria, B. M. G.: On shrinkage and selection: ANOVA MODEL..J. Statist. Res. 51 (2017), 165-191. MR 3753200
Reference: [13] Stein, C.: Inadmissibility of the usual estimator for the mean of a multivariate normal distribution..In: Proc. Third Berkeley Symposium on Mathematical Statistics and Probability, University of California Press 1956, pp. 197-206. MR 0084922
Reference: [14] Tibshirani, R.: Regression shrinkage and selection via the lasso..J. Royal Statist. Soc., Series B (Methodological) 58 (1996), 267-288. MR 1379242, 10.1111/j.2517-6161.1996.tb02080.x
Reference: [15] Tikhonov, A. N.: Solution of incorrectly formulated problems and the regularization method..Doklady Akademii Nauk SSSR 151 (1963), 501-504. MR 0162377
Reference: [16] Zou, H.: The adaptive lasso and its oracle properties..J. Amer. Statist. Assoc. 101 (2006), 1418-1429. MR 2279469, 10.1198/016214506000000735
Reference: [17] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net..J. Royal Stat. Soc. Ser. B Stat. Methodol. 67 (2005), 301-320. MR 2137327, 10.1111/j.1467-9868.2005.00503.x
.

Files

Kybernetika_54-2018-5_6.pdf (529.8 KB, application/pdf)