Model selection and estimation of a component in additive regression
ESAIM: Probability and Statistics, Volume 18 (2014), pp. 77-116.

Let Y ∈ ℝⁿ be a random vector with mean s and covariance matrix σ²PₙᵗPₙ, where Pₙ is some known n × n matrix. We construct a statistical procedure to estimate s under either a moment condition on Y or a Gaussian hypothesis. Both cases are treated for known and unknown σ². Our approach is free of any prior assumption on s and is based on non-asymptotic model selection methods. Given a collection of linear spaces {Sₘ, m ∈ ℳ}, we consider, for each m ∈ ℳ, the least-squares estimator ŝₘ of s in Sₘ. Using a penalty function that is not linear in the dimensions of the Sₘ's, we select some m̂ ∈ ℳ in order to obtain an estimator ŝ whose quadratic risk is as close as possible to the minimal risk among those of the ŝₘ's. Non-asymptotic oracle-type inequalities and minimax convergence rates are proved for ŝ. Special attention is given to the estimation of a nonparametric component in additive models. Finally, we carry out a simulation study to illustrate the practical performance of our estimators.
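
To make the selection scheme described in the abstract concrete, the following minimal Python sketch illustrates generic penalized least-squares model selection: each candidate space Sₘ yields a projection estimator ŝₘ, and a penalized criterion picks m̂. The function name, the nested polynomial spaces and the Mallows-Cₚ-style penalty 2σ²Dₘ are illustrative assumptions only; the paper's penalty is not linear in the dimension and accounts for the covariance structure σ²PₙᵗPₙ and for whether σ² is known.

import numpy as np

def penalized_model_selection(Y, bases, sigma2, pen=None):
    # Illustrative sketch (not the paper's procedure): select among the
    # least-squares estimators of the mean of Y via a penalized criterion.
    #   Y      : observed vector in R^n
    #   bases  : list of (n, D_m) design matrices whose columns span the S_m's
    #   sigma2 : noise level sigma^2, assumed known here
    #   pen    : penalty as a function of D_m; the Mallows-C_p-style default
    #            2 * sigma2 * D_m is a placeholder, not the paper's penalty
    if pen is None:
        pen = lambda D: 2.0 * sigma2 * D
    best_crit, best_fit = np.inf, None
    for X in bases:
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least squares in S_m
        s_hat_m = X @ coef                             # projection of Y onto S_m
        crit = np.sum((Y - s_hat_m) ** 2) + pen(X.shape[1])
        if crit < best_crit:
            best_crit, best_fit = crit, s_hat_m
    return best_fit

# Toy usage on a fixed design with nested polynomial spaces; the noise is
# i.i.d. here for simplicity, whereas the paper allows correlation through P_n.
rng = np.random.default_rng(0)
n, sigma2 = 200, 0.25
t = np.linspace(0.0, 1.0, n)
s = np.sin(2 * np.pi * t)                              # unknown mean to recover
Y = s + np.sqrt(sigma2) * rng.standard_normal(n)
bases = [np.vander(t, d, increasing=True) for d in range(1, 15)]
s_hat = penalized_model_selection(Y, bases, sigma2)

The sketch only conveys the shape of the criterion (residual sum of squares plus a penalty); the calibration that yields the oracle inequalities of the paper under correlated noise is not reproduced here.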

DOI: 10.1051/ps/2012028
Classification: 62G08
Keywords: model selection, nonparametric regression, penalized criterion, oracle inequality, correlated data, additive regression, minimax rate
@article{PS_2014__18__77_0,
     author = {Gendre, Xavier},
     title = {Model selection and estimation of a component in additive regression},
     journal = {ESAIM: Probability and Statistics},
     pages = {77--116},
     publisher = {EDP-Sciences},
     volume = {18},
     year = {2014},
     doi = {10.1051/ps/2012028},
     mrnumber = {3143734},
     language = {en},
     url = {http://archive.numdam.org/articles/10.1051/ps/2012028/}
}
Gendre, Xavier. Model selection and estimation of a component in additive regression. ESAIM: Probability and Statistics, Volume 18 (2014), pp. 77-116. doi: 10.1051/ps/2012028. http://archive.numdam.org/articles/10.1051/ps/2012028/
