$\alpha$-Variational Inference with Statistical Guarantees. (arXiv:1710.03266v1 [math.ST])
Source: arXiv
We propose a variational approximation to Bayesian posterior distributions,
called $\alpha$-VB, with provable statistical guarantees for models with and
without latent variables. The standard variational approximation is a special
case of $\alpha$-VB with $\alpha=1$. When $\alpha \in(0,1)$, a novel class of
variational inequalities is developed for linking the Bayes risk under the
variational approximation to the objective function in the variational
optimization problem, implying that maximizing the evidence lower bound in
variational inference has the effect of minimizing the Bayes risk within the
variational density family. Operating in a frequentist setup, the variational
inequalities imply that point estimates constructed from the $\alpha$-VB
procedure converge at an optimal rate to the true parameter in a wide range of
problems. We illustrate our general theory with a number of examples, including
the mean-field variational approximation to (low)-high-dimensional Bayesian
linear …
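As a rough illustration of the idea (not code from the paper): in $\alpha$-VB the likelihood contribution to the variational objective is tempered by $\alpha$, and $\alpha=1$ recovers the standard variational approximation. For a conjugate Gaussian mean model the optimizer over a Gaussian family is available in closed form, so the effect of $\alpha$ can be seen directly. The function below is a hypothetical sketch under these assumptions.

```python
import numpy as np

def alpha_vb_gaussian_mean(x, alpha, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
    """Closed-form alpha-fractional posterior for the mean of a Gaussian
    with known noise variance and a Gaussian prior.

    The likelihood is raised to the power alpha before combining with the
    prior; alpha=1 gives the usual conjugate posterior (which standard VB
    recovers exactly here, since the family contains it).  Hypothetical
    illustration, not the paper's implementation.
    """
    n = len(x)
    # Tempering the likelihood by alpha scales its precision contribution.
    post_prec = 1.0 / prior_var + alpha * n / noise_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + alpha * np.sum(x) / noise_var)
    return post_mean, post_var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(2.0, 1.0, size=200)
    for a in (0.5, 1.0):
        m, v = alpha_vb_gaussian_mean(x, a)
        print(f"alpha={a}: mean={m:.3f}, var={v:.5f}")
```

Smaller $\alpha$ downweights the data relative to the prior, so the fractional posterior is wider (larger variance) and shrinks more toward the prior mean; as $n$ grows both concentrate at the true parameter, consistent with the rate results stated in the abstract.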