Les utilisateurs de codes numériques onéreux en temps de calcul souhaitent réduire le coût en limitant le nombre de simulations suivant un choix judicieux fondé sur l’utilisation de plans d’expériences adaptés au contexte numérique et appelés “space-filling designs”. Afin de remplir au mieux l’espace des variables d’entrée du simulateur, nous proposons une méthode de construction de plans dont les points sont le plus uniformément répartis dans l’hypercube unité. Pour mesurer l’écart entre la fonction de densité associée aux points du plan et celle de la loi uniforme, nous utilisons l’information de Kullback-Leibler, ce qui revient par ailleurs à utiliser l’entropie de Shannon. Celle-ci est estimée par une méthode de Monte Carlo dans laquelle la fonction de densité est remplacée par son estimation par noyaux gaussiens.
Experimental designs are tools that can dramatically reduce the number of simulations required by time-consuming computer codes. Because the true relation between the response and the inputs is unknown, a design should allow one to fit a variety of models and should provide information about every portion of the experimental region. One strategy for selecting the input values at which to observe the response is to spread them evenly throughout the experimental region, following so-called “space-filling designs”. In this article, we propose a new method based on comparing the empirical distribution of the design points to the uniform distribution through the Kullback-Leibler information. The approach consists in estimating this divergence or, equivalently, the Shannon entropy. The entropy is estimated by a Monte Carlo method in which the density function is replaced by its Gaussian kernel density estimator.
Mots clés : space-filling designs, estimation de l’entropie, méthodes à noyaux
Keywords: space-filling designs, entropy estimation, kernel density estimation
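The criterion described in the abstract (minimal Kullback-Leibler divergence to the uniform density on the unit hypercube, i.e. maximal Shannon entropy, estimated by Monte Carlo with a Gaussian kernel density estimator) can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the leave-one-out evaluation, the Scott-type bandwidth, the `entropy_estimate` and `exchange_search` names, and the naive point-exchange search are all assumptions made here for the example.

```python
import numpy as np

def entropy_estimate(X, h=None):
    """Monte Carlo (resubstitution) estimate of the Shannon entropy of the
    design X in [0,1]^d, using a leave-one-out Gaussian kernel density
    estimator.  Minimizing KL(f || uniform) on the unit hypercube is
    equivalent to maximizing this entropy."""
    n, d = X.shape
    if h is None:
        # Rule-of-thumb bandwidth (an assumption; the paper may tune h differently)
        h = n ** (-1.0 / (d + 4))
    # Pairwise squared distances between design points
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel values, normalized in dimension d
    K = np.exp(-sq / (2.0 * h * h)) / ((2.0 * np.pi) ** (d / 2) * h ** d)
    np.fill_diagonal(K, 0.0)            # leave-one-out: drop self-contribution
    dens = K.sum(axis=1) / (n - 1)      # kernel density at each design point
    return -np.mean(np.log(dens))       # H(f) ~ -(1/n) sum log f_hat(x_i)

def exchange_search(n, d, iters=2000, seed=0):
    """Toy point-exchange search: repeatedly move one point to a fresh
    uniform draw and keep the move if it increases the estimated entropy
    (i.e. lowers the estimated KL divergence to the uniform density)."""
    rng = np.random.default_rng(seed)
    X = rng.random((n, d))              # random initial design in [0,1]^d
    best = entropy_estimate(X)
    for _ in range(iters):
        i = rng.integers(n)
        old = X[i].copy()
        X[i] = rng.random(d)
        cand = entropy_estimate(X)
        if cand > best:
            best = cand                 # accept the exchange
        else:
            X[i] = old                  # revert
    return X, best
```

A real construction would use a more careful optimizer (simulated annealing, coordinate exchange on a Latin hypercube), but the accept/reject loop above shows how the entropy estimate drives the design toward uniformity.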
@article{JSFS_2009__150_2_52_0,
  author = {Jourdan, Astrid and Franco, Jessica},
  title = {Plans d{\textquoteright}exp\'eriences num\'eriques d{\textquoteright}information de {Kullback-Leibler} minimale},
  journal = {Journal de la soci\'et\'e fran\c{c}aise de statistique},
  pages = {52--64},
  publisher = {Soci\'et\'e fran\c{c}aise de statistique},
  volume = {150},
  number = {2},
  year = {2009},
  mrnumber = {2609691},
  zbl = {1311.62121},
  language = {fr},
  url = {http://archive.numdam.org/item/JSFS_2009__150_2_52_0/}
}
Jourdan, Astrid; Franco, Jessica. Plans d’expériences numériques d’information de Kullback-Leibler minimale. Journal de la société française de statistique, Tome 150 (2009) no. 2, pp. 52-64. http://archive.numdam.org/item/JSFS_2009__150_2_52_0/
[1] Nonparametric entropy estimation: an overview., Int. J. Math. Stat. Sci., Volume 6 (1997) no. 1, pp. 17-36 | MR | Zbl
[2] Design of experiments for response diversity., Proc. 6th International Conference on Inverse Problems in Engineering (ICIPE), Journal of Physics: Conference Series, Dourdan (Paris), June 2008
[3] Distribution estimation consistent in total variation and two types of information divergence, IEEE Trans. Inform. Theory, Volume 5 (1992), pp. 1867-1883 | MR | Zbl
[4] A Bayesian approach to the design and analysis of computer experiments. (1988) no. 6498 (Technical report available from the National Technical Information Service, Springfield, VA 22161)
[5] Bayesian Experimental Design: A review., Statist. Sci., Volume 10 (1995), pp. 237-304 | MR | Zbl
[6] Entropy-Based tests of uniformity., J. Amer. Statist. Assoc., Volume 76 (1981), pp. 967-974 | MR | Zbl
[7] Planification d’expériences numériques en phase exploratoire pour la simulation des phénomènes complexes., École Nationale Supérieure des Mines de Saint-Étienne (2008) (Ph. D. Thesis)
[8] Uniformity measures for point samples in hypercubes., https://people.scs.fsu.edu/burkardt/pdf/ptmeas.pdf, 2004
[9] Applications of entropic spanning graphs., IEEE Signal Processing Magazine, Volume 19 (2002), pp. 85-95
[10] Minimax and maximin distance designs., J. Statist. Plann. Inf., Volume 26 (1990), pp. 131-148 | MR
[11] Estimation of entropy and other functionals of a multivariate density., Ann. Inst. Statist. Math., Volume 41 (1989), pp. 683-697 | MR | Zbl
[12] On information and sufficiency., Ann. Math. Statist., Volume 22 (1951), pp. 79-86 | MR | Zbl
[13] Sample estimate of the entropy of a random vector., Problems of Information Transmission, Volume 23 (1987), pp. 95-101 | Zbl
[14] Computer Experiments., Handbook of statistics, Volume 13 (1996), pp. 261-308 | MR | Zbl
[15] A class of Rényi information estimators for multidimensional densities., Annals of Statistics (2009) (to appear) | Zbl
[16] Point sets and sequences with small discrepancy., Monatsh. Math., Volume 104 (1987), pp. 273-337 | MR | Zbl
[17] Optimal Latin Hypercube Designs for Computer Experiments., J. Stat. Plan. Inf., Volume 39 (1994), pp. 95-111 | MR | Zbl
[18] Multivariate Density Estimation: Theory, Practice and Visualization, John Wiley & Sons, New York, Chichester, 1992 | MR | Zbl
[19] Density estimation for statistics and data analysis., Chapman & Hall, London, 1986 | MR | Zbl
[20] Large sample properties of simulations using Latin hypercube sampling., Technometrics, Volume 29 (1987), pp. 143-151 | MR | Zbl
[21] Maximum entropy sampling and optimal Bayesian experimental design., J. Royal Statist. Soc., Volume 62 (2000), pp. 145-157 | MR | Zbl
[22] Maximum Entropy Sampling., J. Appl. Statist., Volume 14 (1987), pp. 165-170
[23] Sur le calcul et la majoration de la discrépance à l’origine., École polytechnique fédérale de Lausanne (2000) (Ph. D. Thesis)