Empirical Bayes scaling of Gaussian priors in the white noise model

B.T. Szabó, A.W. van der Vaart, J.H. van Zanten

Research output: Contribution to journal › Article › Academic › peer-review



The performance of nonparametric estimators is heavily dependent on a bandwidth parameter. In nonparametric Bayesian methods this parameter can be specified as a hyperparameter of the nonparametric prior, and its value may be made dependent on the data. The empirical Bayes method sets its value by maximizing the marginal likelihood of the data in the Bayesian framework. In this paper we analyze a particular version of this method, common in practice, in which the hyperparameter scales the prior variance. We characterize the behavior of the random hyperparameter, and show that a nonparametric Bayes method using it gives optimal recovery over a scale of regularity classes. This scale is limited, however, by the regularity of the unscaled prior. While a prior can be scaled up to make it appropriate for arbitrarily rough truths, scaling cannot increase the nominal smoothness by much. Surprisingly, the standard empirical Bayes method is even more limited in this respect than an oracle, deterministic scaling method. The same can be said for the hierarchical Bayes method.
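The empirical Bayes procedure analyzed in the paper can be sketched numerically. The following is my own minimal illustration, not the authors' code: in the sequence form of the white noise model, observations are X_i = θ_i + n^{-1/2} Z_i, the prior is θ_i ~ N(0, τ² i^{-1-2α}) for an assumed base regularity α, and the scaling hyperparameter τ is chosen by maximizing the marginal likelihood of the data (here over a grid, for simplicity). The particular truth, α, and grid are illustrative assumptions.

```python
import numpy as np

# Sketch of empirical Bayes scaling of a Gaussian prior in the white
# noise model (sequence form): X_i = theta_i + n^{-1/2} Z_i, Z_i ~ N(0,1),
# prior theta_i ~ N(0, tau^2 * i^{-1-2*alpha}). The scaling tau is set by
# maximizing the marginal likelihood X_i ~ N(0, tau^2*lam_i + 1/n).

rng = np.random.default_rng(0)
n = 1000
alpha = 1.0                       # assumed regularity of the unscaled prior
i = np.arange(1, n + 1)
theta = i ** (-1.5) * np.cos(i)   # an illustrative "truth"
x = theta + rng.standard_normal(n) / np.sqrt(n)

lam = i ** (-1.0 - 2.0 * alpha)   # unscaled prior variances

def neg_log_marginal(tau):
    """Negative log marginal likelihood under scaling tau (up to a constant)."""
    v = tau ** 2 * lam + 1.0 / n  # marginal variance of X_i
    return 0.5 * np.sum(np.log(v) + x ** 2 / v)

# Empirical Bayes: maximize the marginal likelihood over a grid of scalings.
taus = np.logspace(-3, 3, 400)
tau_hat = taus[np.argmin([neg_log_marginal(t) for t in taus])]

# Posterior mean under the selected scaling: coordinatewise shrinkage of x.
shrink = tau_hat ** 2 * lam / (tau_hat ** 2 * lam + 1.0 / n)
theta_hat = shrink * x
```

The shrinkage factors decrease with the index i, so high-frequency coefficients are smoothed more heavily; the data-driven τ̂ controls where that transition happens, which is exactly the bandwidth-like role of the hyperparameter discussed in the abstract.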
Original language: English
Pages (from-to): 991-1018
Journal: Electronic Journal of Statistics
Publication status: Published - 2013


