Model selection for estimating the non zero components of a gaussian vector
ESAIM: Probability and Statistics, Tome 10 (2006), pp. 164-183.

We propose a method based on a penalised likelihood criterion for estimating the number of non-zero components of the mean of a Gaussian vector. Following the work of Birgé and Massart on Gaussian model selection, we choose the penalty function so that the resulting estimator minimises the Kullback risk.
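To make the idea concrete, the following is a minimal numerical sketch of selecting the number of non-zero mean components by minimising a penalised likelihood criterion. It assumes the variance sigma^2 is known and uses a penalty of Birgé and Massart flavour, k(1 + sqrt(2 log(n/k)))^2, chosen purely for illustration; this is not the penalty function derived in the paper, which is calibrated on the Kullback risk.

import numpy as np

def select_nonzero_count(x, sigma=1.0, pen=None):
    """Select the number k of non-zero mean components of x ~ N(mu, sigma^2 I_n)
    by minimising a penalised likelihood criterion.

    Illustrative sketch only: the default `pen` is a Birge-Massart-style term
    k * (1 + sqrt(2 * log(n / k)))**2, not the penalty derived in the paper.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if pen is None:
        pen = lambda k: k * (1.0 + np.sqrt(2.0 * np.log(n / k))) ** 2 if k > 0 else 0.0

    # The model of dimension k fits the k largest |x_i| by their observed values
    # and sets the remaining components to 0, so its residual sum of squares is
    # the sum of the n - k smallest x_i^2.
    x2_sorted = np.sort(x ** 2)[::-1]
    rss = np.concatenate(([x2_sorted.sum()], x2_sorted.sum() - np.cumsum(x2_sorted)))

    # Penalised criterion for k = 0, ..., n; return the minimiser.
    crit = np.array([rss[k] / sigma ** 2 + pen(k) for k in range(n + 1)])
    return int(np.argmin(crit))

# Toy usage: 5 non-zero components out of 50.
rng = np.random.default_rng(0)
mu = np.concatenate((4.0 * np.ones(5), np.zeros(45)))
x = mu + rng.standard_normal(50)
print(select_nonzero_count(x))  # typically close to 5

The default penalty and the known-variance assumption are stand-ins: any penalty function can be passed via `pen`, and the paper's criterion handles the unknown-variance case through the Kullback risk.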

DOI : 10.1051/ps:2006004
Classification : 62G05, 62G09
Keywords: Kullback risk, model selection, penalised likelihood criteria
@article{PS_2006__10__164_0,
     author = {Huet, Sylvie},
     title = {Model selection for estimating the non zero components of a gaussian vector},
     journal = {ESAIM: Probability and Statistics},
     pages = {164--183},
     publisher = {EDP-Sciences},
     volume = {10},
     year = {2006},
     doi = {10.1051/ps:2006004},
     mrnumber = {2218407},
     language = {en},
     url = {http://archive.numdam.org/articles/10.1051/ps:2006004/}
}
Huet, Sylvie. Model selection for estimating the non zero components of a gaussian vector. ESAIM: Probability and Statistics, Tome 10 (2006), pp. 164-183. doi : 10.1051/ps:2006004. http://archive.numdam.org/articles/10.1051/ps:2006004/

[1] F. Abramovich, Y. Benjamini, D. Donoho and I. Johnstone, Adapting to unknown sparsity by controlling the false discovery rate. Technical Report 2000-19, Department of Statistics, Stanford University (2000). | Zbl

[2] H. Akaike, Information theory and an extension of the maximum likelihood principle, in 2nd International Symposium on Information Theory, B.N. Petrov and F. Csaki Eds., Akadémiai Kiadó, Budapest (1973) 267-281. | Zbl

[3] H. Akaike, A Bayesian analysis of the minimum AIC procedure. Ann. Inst. Statist. Math. 30 (1978) 9-14. | Zbl

[4] A. Antoniadis, I. Gijbels and G. Grégoire, Model selection using wavelet decomposition and applications. Biometrika 84 (1997) 751-763. | Zbl

[5] Y. Baraud, S. Huet and B. Laurent, Adaptive tests of qualitative hypotheses. ESAIM: PS 7 (2003) 147-159. | Numdam | Zbl

[6] A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Rel. Fields 113 (1999) 301-413. | Zbl

[7] Y. Benjamini and Y. Hochberg, Controlling the false discovery rate: a practical and powerful approach to multiple testing. J. R. Statist. Soc. B 57 (1995) 289-300. | Zbl

[8] L. Birgé and P. Massart, Gaussian model selection. J. Eur. Math. Soc. (JEMS) 3 (2001) 203-268. | Zbl

[9] L. Birgé and P. Massart, A generalized Cp criterion for Gaussian model selection. Technical report, Univ. Paris 6, Paris 7, Paris (2001). | MR

[10] B.S. Cirel'son, I.A. Ibragimov and V.N. Sudakov, Norms of Gaussian sample functions, in Proceedings of the 3rd Japan-USSR Symposium on Probability Theory, Springer-Verlag, Berlin. Springer Lect. Notes Math. 550 (1976) 20-41. | Zbl

[11] H.A. David, Order Statistics. Wiley Series in Probability and Mathematical Statistics. John Wiley and Sons, NY (1981). | MR | Zbl

[12] G.E.P. Box and R.D. Meyer, An analysis for unreplicated fractional factorials. Technometrics 28 (1986) 11-18. | Zbl

[13] D.P. Foster and R.A. Stine, Adaptive variable selection competes with Bayes expert. Technical report, The Wharton School of the University of Pennsylvania, Philadelphia (2002).

[14] S. Huet, Comparison of methods for estimating the non zero components of a gaussian vector. Technical report, INRA, MIA-Jouy, www.inra.fr/miaj/apps/cgi-bin/raptech.cgi (2005).

[15] C.M. Hurvich and C.-L. Tsai, Regression and time series model selection in small samples. Biometrika 76 (1989) 297-307. | Zbl

[16] I. Johnstone and B. Silverman, Empirical Bayes selection of wavelet thresholds. Available from www.stats.ox.ac.uk/~silverma/papers.html (2003). | Zbl

[17] B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection. Ann. Statist. 28 (2000) 1302-1338. | Zbl

[18] R. Nishii, Maximum likelihood principle and model selection when the true model is unspecified. J. Multivariate Anal. 27 (1988) 392-403. | Zbl

[19] P.D. Haaland and M.A. O'Connell, Inference for effect-saturated fractional factorials. Technometrics 37 (1995) 82-93. | Zbl

[20] J. Rissanen, Universal coding, information, prediction and estimation. IEEE Trans. Inform. Theory 30 (1984) 629-636. | Zbl

[21] R.V. Lenth, Quick and easy analysis of unreplicated factorials. Technometrics 31(4) (1989) 469-473.

[22] G. Schwarz, Estimating the dimension of a model. Ann. Statist. 6 (1978) 461-464. | Zbl
