Nonparametric Regression Combining Linear Structure
ZHANG Yanli 1, SONG Yunquan 2, LIN Lu 3, WANG Xiuli 4
1. School of Statistics, Shandong University of Finance and Economics, Jinan 250014, Shandong, China; 2. College of Science, China University of Petroleum, Qingdao 266580, Shandong, China; 3. Zhongtai Securities Institute for Financial Studies, Shandong University, Jinan 250100, Shandong, China; 4. School of Mathematics and Statistics, Shandong Normal University, Jinan 250014, Shandong, China
Nonparametric models are popular owing to their flexibility in model building and optimality in estimation. However, they suffer from the curse of dimensionality and make no use of prior information. How to fully exploit the structural information hidden in the data remains a challenging issue in model building. In this paper, we propose a parametric family of estimators that penalizes deviation from linear structure. The new estimator automatically captures the linear information underlying the regression function, thereby avoiding the curse of dimensionality, and offers a smooth choice between fully nonparametric models and parametric models. Moreover, the new estimator reduces to the linear estimator when the model has a linear structure, and to the local linear estimator when the model has no linear structure. Compared with fully nonparametric models, our estimator has smaller bias because it uses the linear structure information in the data; it is therefore especially useful in higher dimensions, where the usual nonparametric methods suffer from the curse of dimensionality. Based on a projection framework, our theoretical results give the structure of the new estimator, and simulation studies demonstrate the advantages of the new approach.
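To illustrate the idea of penalizing deviation from linear structure, the following is a minimal one-dimensional sketch (not the paper's actual estimator): a local linear fit at a point `x0` whose intercept is shrunk toward a global OLS line by a penalty weight `lam`. The function name, the Gaussian kernel, and the specific penalty form are all illustrative assumptions; with `lam = 0` the fit is the usual local linear estimator, and as `lam` grows it approaches the parametric linear fit, mimicking the smooth transition described above.

```python
import numpy as np

def penalized_local_linear(x0, x, y, h=0.3, lam=1.0):
    """Hypothetical sketch: local linear fit at x0 with a penalty
    pulling the local intercept toward a global OLS line."""
    # Global OLS line beta0 + beta1 * x (the parametric target)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    g = beta[0] + beta[1] * x0          # global linear prediction at x0

    # Gaussian kernel weights centered at x0
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)

    # Penalized weighted least squares:
    #   min_{a,b} sum_i w_i (y_i - a - b (x_i - x0))^2 + lam * (a - g)^2
    Z = np.column_stack([np.ones_like(x), x - x0])
    A = Z.T @ (w[:, None] * Z)
    A[0, 0] += lam                      # ridge-type penalty on the intercept
    rhs = Z.T @ (w * y)
    rhs[0] += lam * g
    a, b = np.linalg.solve(A, rhs)
    return a                            # estimate of m(x0)
```

When the data are exactly linear, both the local linear fit and the global line agree, so the estimate is unaffected by `lam`; on curved data, increasing `lam` trades local fidelity for the global linear structure.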