Finite-time optimality of Bayesian predictors. (arXiv:1812.08292v1 [cs.LG])
Source: arXiv
The problem of sequential probability forecasting is considered in the most
general setting: a model set C is given, and it is required to predict as well
as possible if any of the measures (environments) in C is chosen to generate
the data. No assumptions whatsoever are made on the model class C, in
particular, no independence or mixing assumptions; C may not be measurable;
there may be no predictor whose loss is sublinear, etc. It is shown that the
cumulative loss of any possible predictor can be matched by that of a Bayesian
predictor whose prior is discrete and is concentrated on C, up to an additive
term of order $\log n$, where $n$ is the time step. The bound holds for every
$n$ and every measure in C. This is the first non-asymptotic result of this
kind. In addition, a non-matching lower bound is established: it goes to
infinity with $n$, but may do so arbitrarily slowly.
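The key object in the abstract, a Bayesian predictor with a discrete prior concentrated on the model class, can be illustrated on a toy finite class. The sketch below (an illustration only; the paper's construction handles arbitrary, possibly non-measurable classes) uses a mixture over a handful of i.i.d. Bernoulli environments and verifies the classical finite-class version of the additive bound: the mixture's cumulative log-loss exceeds that of the true environment by at most $\log|C|$, the log of the inverse prior weight.

```python
import math
import random

def mixture_log_loss_demo(n=1000, seed=0):
    # Toy model class C: i.i.d. Bernoulli(p) environments (an assumption
    # for illustration; the paper makes no such restrictions on C).
    models = [0.1, 0.3, 0.5, 0.7, 0.9]
    prior = [1.0 / len(models)] * len(models)  # uniform discrete prior on C
    true_p = 0.7                               # environment generating the data

    rng = random.Random(seed)
    posterior = prior[:]
    mix_loss = 0.0   # cumulative log-loss of the Bayesian mixture predictor
    true_loss = 0.0  # cumulative log-loss of the true environment's predictor

    for _ in range(n):
        x = 1 if rng.random() < true_p else 0
        # Predictive probability of x under the mixture (chain rule:
        # summing these per-step losses equals -log of the mixture
        # probability of the whole sequence).
        p_mix = sum(w * (p if x else 1.0 - p)
                    for w, p in zip(posterior, models))
        mix_loss += -math.log(p_mix)
        true_loss += -math.log(true_p if x else 1.0 - true_p)
        # Bayesian posterior update on the discrete prior.
        posterior = [w * (p if x else 1.0 - p) / p_mix
                     for w, p in zip(posterior, models)]

    return mix_loss, true_loss, math.log(len(models))

mix_loss, true_loss, bound = mixture_log_loss_demo()
# For any component measure, mix_loss <= component's loss + log(1/prior weight),
# so the regret against the true environment is at most log 5 here.
```

For a finite class this additive term is a constant ($\log|C|$); the paper's contribution is that for an arbitrary class a discrete prior still achieves an additive term of order $\log n$ at every finite time $n$, rather than only asymptotically.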