A lower bound on the differential entropy of log-concave random vectors with applications. (arXiv:1704.07766v1 [cs.IT])
Source: arXiv
We derive a lower bound on the differential entropy of a log-concave random
variable $X$ in terms of the $p$-th absolute moment of $X$. The new bound leads
to a reverse entropy power inequality with an explicit constant, and to new
bounds on the rate-distortion function and the channel capacity.
Specifically, we study the rate-distortion function for log-concave sources
and distortion measure $|x - \hat x|^r$, and we establish that the difference
between the rate-distortion function and the Shannon lower bound is at most
$\log(2 \sqrt{\pi e}) \approx 2.5$ bits, independently of $r$ and the target
distortion $d$. For mean-square error distortion, the difference is at most
$\log (\sqrt{2 \pi e}) \approx 2$ bits, regardless of $d$. The bounds can be
further strengthened if the source, in addition to being log-concave, is
symmetric. In particular, we establish that for mean-square error distortion,
the difference is at most $\log (\sqrt{\pi e}) \approx 1.5$ bits, regardless of
$d$.
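The gap bounds above can be checked numerically. The sketch below assumes the logarithms in the abstract are base 2 (the quoted values 2.5, 2, and 1.5 bits match base-2 logarithms), and illustrates the mean-square-error Shannon lower bound $R_{\mathrm{SLB}}(d) = h(X) - \tfrac{1}{2}\log_2(2\pi e d)$ on the Gaussian source, where it is met with equality; the source variance and distortion level chosen here are arbitrary:

```python
import math

def bits(x):
    return math.log2(x)

# Gap constants from the abstract (in bits, assuming base-2 logs):
gap_general = bits(2 * math.sqrt(math.pi * math.e))  # ~2.5 bits, any r, any d
gap_mse     = bits(math.sqrt(2 * math.pi * math.e))  # ~2 bits, MSE distortion
gap_mse_sym = bits(math.sqrt(math.pi * math.e))      # ~1.5 bits, symmetric source, MSE

# Shannon lower bound for MSE distortion: R_SLB(d) = h(X) - (1/2) log2(2*pi*e*d).
# For a Gaussian source with variance sigma2, R(d) = (1/2) log2(sigma2 / d)
# meets the SLB exactly, so the gap is 0, well inside the log-concave bounds.
sigma2, d = 1.0, 0.1
h_gauss = 0.5 * bits(2 * math.pi * math.e * sigma2)  # differential entropy
r_gauss = 0.5 * bits(sigma2 / d)                     # rate-distortion function
slb = h_gauss - 0.5 * bits(2 * math.pi * math.e * d)

print(round(gap_general, 3), round(gap_mse, 3), round(gap_mse_sym, 3))
print(round(r_gauss - slb, 12))
```

For a non-Gaussian log-concave source (e.g., Laplace or uniform), the same `slb` computation applies with its differential entropy, and the abstract guarantees the true rate-distortion function exceeds it by at most the constants above.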
We als