Linear Bounds between Contraction Coefficients for $f$-Divergences. (arXiv:1510.01844v4 [cs.IT] UPDATED)
Source: arXiv
Data processing inequalities for $f$-divergences can be sharpened using
constants called "contraction coefficients" to produce strong data processing
inequalities. For any discrete source-channel pair, the contraction
coefficients for $f$-divergences are lower bounded by the contraction
coefficient for $\chi^2$-divergence. In this paper, we show that this lower
bound can be achieved by driving the input $f$-divergences of the contraction
coefficients to zero. Then, we establish a linear upper bound on
the contraction coefficients for a certain class of $f$-divergences using the
contraction coefficient for $\chi^2$-divergence, and refine this upper bound
for the salient special case of Kullback-Leibler (KL) divergence. Furthermore,
we present an alternative proof of the fact that the contraction coefficients
for KL and $\chi^2$-divergences are equal for a Gaussian source with an
additive Gaussian noise channel (where the former coefficient can be power
constrained). Finally, we gen…
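
For context, the standard definitions behind these statements (textbook background, not quoted from the paper): for a discrete source $P_X$ and channel $P_{Y|X}$, the contraction coefficient for an $f$-divergence $D_f$ is
$$\eta_f(P_X, P_{Y|X}) \,=\, \sup_{Q_X :\, 0 < D_f(Q_X \| P_X) < \infty} \frac{D_f(Q_Y \| P_Y)}{D_f(Q_X \| P_X)},$$
where $Q_Y$ and $P_Y$ denote the output distributions obtained by passing $Q_X$ and $P_X$ through the channel. The resulting strong data processing inequality is $D_f(Q_Y \| P_Y) \le \eta_f(P_X, P_{Y|X}) \, D_f(Q_X \| P_X)$, and the lower bound mentioned above reads $\eta_{\chi^2}(P_X, P_{Y|X}) \le \eta_f(P_X, P_{Y|X})$.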
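
As a concrete instance of the Gaussian statement (a standard computation under the usual AWGN model, not a derivation from the paper): for $X \sim \mathcal{N}(0, P)$ and $Y = X + Z$ with $Z \sim \mathcal{N}(0, \sigma^2)$ independent of $X$, the correlation between $X$ and $Y$ is $\rho = \sqrt{P/(P+\sigma^2)}$, and since the maximal correlation of jointly Gaussian variables equals $|\rho|$,
$$\eta_{\chi^2}(P_X, P_{Y|X}) \,=\, \rho^2 \,=\, \frac{P}{P + \sigma^2}.$$
The abstract's claim is that the KL contraction coefficient coincides with this value in that setting.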
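
The achievability claim can also be checked numerically in the discrete case. Below is a minimal sketch (my own illustration, not code from the paper; the source and channel are hypothetical), assuming the standard fact that $\eta_{\chi^2}$ equals the squared second-largest singular value of the divergence transition matrix $B(y,x) = P_{Y|X}(y \mid x)\sqrt{P_X(x)}/\sqrt{P_Y(y)}$; perturbing $P_X$ along the corresponding singular vector and shrinking the perturbation drives the KL contraction ratio toward $\eta_{\chi^2}$:

    import numpy as np

    # Divergence transition matrix B[y, x] = W[y, x] * sqrt(p_x[x]) / sqrt(p_y[y]);
    # its largest singular value is always 1 (right singular vector sqrt(p_x)),
    # and eta_{chi^2} is the square of the second-largest singular value.
    def dtm(p_x, W):
        p_y = W @ p_x
        return W * np.sqrt(p_x)[None, :] / np.sqrt(p_y)[:, None]

    def kl(q, p):
        return float(np.sum(q * np.log(q / p)))

    p_x = np.array([0.3, 0.5, 0.2])       # hypothetical source, for illustration
    W = np.array([[0.7, 0.2, 0.1],        # hypothetical channel: column x is P_{Y|X}(.|x)
                  [0.2, 0.6, 0.3],
                  [0.1, 0.2, 0.6]])

    U, s, Vt = np.linalg.svd(dtm(p_x, W))
    eta_chi2 = s[1] ** 2
    v2 = Vt[1]                            # second right singular vector

    for eps in (1e-1, 1e-2, 1e-3):
        # Zero-sum perturbation of p_x in the worst-case direction; it stays a
        # valid distribution because v2 is orthogonal to sqrt(p_x).
        q_x = p_x + eps * np.sqrt(p_x) * v2
        ratio = kl(W @ q_x, W @ p_x) / kl(q_x, p_x)
        print(f"eps={eps:.0e}: KL ratio = {ratio:.6f}  (eta_chi2 = {eta_chi2:.6f})")

As eps shrinks, the printed KL ratio converges to eta_chi2, matching the abstract's point that the $\chi^2$ lower bound is attained in the limit of vanishing input divergence.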