The transition kernel of the well-known Metropolis-Hastings (MH) algorithm has a point mass at the chain's current position, which prevents smoothness properties from being derived directly for the successive marginal densities produced by this algorithm. We show here that, under mild smoothness assumptions on the MH algorithm's “input” densities (the initial, proposal and target distributions), a Lipschitz condition can be proved to propagate to these iterative densities. This allows us to build a consistent nonparametric estimate of the entropy of these iterative densities. This theoretical study can be viewed as a building block for a more general MCMC evaluation tool based on such estimates.
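For context, the point mass mentioned above appears in the standard decomposition of the MH transition kernel, and the successive marginal densities follow the usual one-step recursion. The display below is the textbook formulation (with f the target density and q the proposal density), given here only as background and not as the paper's exact set of assumptions.

```latex
% Metropolis-Hastings kernel: absolutely continuous part plus a point mass
% \delta_x at the current position x (the source of the non-smoothness issue).
\[
  P(x,\mathrm{d}y) = q(x,y)\,\alpha(x,y)\,\mathrm{d}y
    + \Big(1-\int q(x,z)\,\alpha(x,z)\,\mathrm{d}z\Big)\,\delta_x(\mathrm{d}y),
  \qquad
  \alpha(x,y) = \min\Big\{1,\ \frac{f(y)\,q(y,x)}{f(x)\,q(x,y)}\Big\}.
\]
% If X^0 has density p^0, the marginal density p^n of X^n satisfies
\[
  p^{n+1}(y) = \int p^{n}(x)\,q(x,y)\,\alpha(x,y)\,\mathrm{d}x
    + p^{n}(y)\Big(1-\int q(y,z)\,\alpha(y,z)\,\mathrm{d}z\Big),
\]
% and the quantities of interest are the entropy and the Kullback divergence
\[
  H(p^{n}) = -\int p^{n}\log p^{n},
  \qquad
  \mathcal{K}(p^{n},f) = \int p^{n}\log\frac{p^{n}}{f}.
\]
```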
Keywords: entropy, Kullback divergence, Metropolis-Hastings algorithm, nonparametric statistic
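As an illustration of how such a nonparametric entropy estimate can be computed in practice, the sketch below runs i.i.d. parallel MH chains and plugs a kernel density estimate of the iteration-n marginal into the entropy. This is a minimal sketch under assumptions of my own: a Gaussian random-walk proposal, scipy's gaussian_kde, a resubstitution-type estimator, and the hypothetical helper name mh_parallel_entropy; the estimator actually studied in the paper may differ in its smoothing and leave-one-out details.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

def mh_parallel_entropy(log_target, proposal_scale, n_chains=2000, n_iter=50, init_scale=5.0):
    """Run n_chains i.i.d. random-walk Metropolis-Hastings chains in parallel and
    return, for each iteration n, a plug-in estimate of the entropy
    H(p^n) = -E[log p^n(X^n)] of the marginal density p^n (1-d example)."""
    x = rng.normal(0.0, init_scale, size=n_chains)   # X^0 drawn from a dispersed initial density p^0
    entropies = []
    for _ in range(n_iter):
        y = x + rng.normal(0.0, proposal_scale, size=n_chains)  # symmetric random-walk proposal
        log_alpha = log_target(y) - log_target(x)               # MH acceptance log-ratio
        accept = np.log(rng.uniform(size=n_chains)) < log_alpha
        x = np.where(accept, y, x)                              # point mass: the chain stays put on rejection
        kde = gaussian_kde(x)                                   # nonparametric estimate of p^n from the N chains
        entropies.append(-np.mean(np.log(kde(x))))              # resubstitution entropy estimate of H(p^n)
    return np.array(entropies)

# Example: standard Gaussian target; the estimates should approach
# the true entropy 0.5*log(2*pi*e) ~ 1.419 as n grows.
H_hat = mh_parallel_entropy(lambda t: norm.logpdf(t), proposal_scale=1.0)
print(H_hat[-1])
```

With a normalized target, an estimate of the Kullback divergence K(p^n, f) follows from the same output by averaging log p̂_n(X_i^n) − log f(X_i^n) over the chains, which is one way such estimates can serve as an MCMC convergence diagnostic.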
@article{PS_2013__17__419_0,
  author    = {Chauveau, Didier and Vandekerkhove, Pierre},
  title     = {Smoothness of {Metropolis-Hastings} algorithm and application to entropy estimation},
  journal   = {ESAIM: Probability and Statistics},
  pages     = {419--431},
  publisher = {EDP-Sciences},
  volume    = {17},
  year      = {2013},
  doi       = {10.1051/ps/2012004},
  mrnumber  = {3066386},
  language  = {en},
  url       = {http://archive.numdam.org/articles/10.1051/ps/2012004/}
}
Chauveau, Didier; Vandekerkhove, Pierre. Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, Volume 17 (2013), pp. 419-431. doi: 10.1051/ps/2012004. http://archive.numdam.org/articles/10.1051/ps/2012004/