
Agnostic Sample Compression for Linear Regression. (arXiv:1810.01864v1 [cs.LG])

Source: arXiv
We obtain the first positive results for bounded sample compression in the agnostic regression setting. We show that for $p \in \{1, \infty\}$, agnostic linear regression with $\ell_p$ loss admits a bounded sample compression scheme. Specifically, we exhibit efficient sample compression schemes for agnostic linear regression in $\mathbb{R}^d$ of size $d+1$ under the $\ell_1$ loss and size $d+2$ under the $\ell_\infty$ loss. We further show that for every other $\ell_p$ loss ($1 < p < \infty$), there does not exist an agnostic compression scheme of bounded size. This refines and generalizes a negative result of David, Moran, and Yehudayoff (2016) for the $\ell_2$ loss. We close by posing a general open question: for agnostic regression with $\ell_1$ loss, does every function class admit a compression scheme of size equal to its pseudo-dimension? This question generalizes Warmuth's classic sample compression conjecture for realizable-case classification (Warmuth, 2003).
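
For reference, here is a minimal sketch of what "sample compression scheme of size $k$" standardly means in this agnostic setting, assuming the conventions of the compression-scheme literature (e.g. David, Moran, and Yehudayoff, 2016); optional side-information bits are omitted for brevity. A compression map $\kappa$ selects at most $k$ labeled points from the sample, and a reconstruction map $\rho$ must produce a predictor that does at least as well on the full sample as the best function in the class $\mathcal{F}$:

$$\kappa : (\mathcal{X}\times\mathbb{R})^{*} \to (\mathcal{X}\times\mathbb{R})^{\le k}, \qquad \rho : (\mathcal{X}\times\mathbb{R})^{\le k} \to \mathbb{R}^{\mathcal{X}},$$

where $\kappa(S)$ is required to be a subsequence of $S$, and for every sample $S$,

$$L_S\bigl(\rho(\kappa(S))\bigr) \;\le\; \inf_{f\in\mathcal{F}} L_S(f), \qquad L_S(h) = \sum_{(x,y)\in S} \lvert h(x)-y\rvert^{p}$$

(for the $\ell_\infty$ loss, $L_S(h) = \max_{(x,y)\in S} \lvert h(x)-y\rvert$). In this language, the paper's positive results say that for linear predictors in $\mathbb{R}^d$ such a pair $(\kappa,\rho)$ exists with $k = d+1$ for $\ell_1$ and $k = d+2$ for $\ell_\infty$, while no bounded $k$ suffices for any $1 < p < \infty$.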