A New Perspective on Robust $M$-Estimation: Finite Sample Theory and Applications to Dependence-Adjusted Multiple Testing. (arXiv:1711.05381v1 [math.ST])
Source: arXiv
Heavy-tailed errors impair the accuracy of the least squares estimate, which
can be spoiled by a single grossly outlying observation. As argued in the
seminal work of Peter Huber in 1973 [Ann. Statist. 1 (1973)
799–821], robust alternatives to the method of least squares are sorely
needed. To achieve robustness against heavy-tailed sampling distributions, we
revisit the Huber estimator from a new perspective by letting the tuning
parameter involved diverge with the sample size. In this paper, we develop
nonasymptotic concentration results for such an adaptive Huber estimator,
namely, the Huber estimator with the tuning parameter adapted to sample size,
dimension, and the variance of the noise. Specifically, we obtain a
sub-Gaussian-type deviation inequality and a nonasymptotic Bahadur
representation when noise variables only have finite second moments. The
nonasymptotic results further yield two conventional normal approximation
results that are of independent interest, th
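The idea of an adaptive Huber estimator — a Huber M-estimator whose robustification parameter grows with the sample size rather than staying fixed — can be sketched as follows. This is a minimal illustration, not the paper's procedure: the constant `c`, the crude noise-scale proxy `sigma_hat`, and the tuning rate `sqrt(n / log n)` are assumptions chosen for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

def huber_loss(r, tau):
    """Huber loss: quadratic for |r| <= tau, linear beyond (robust to outliers)."""
    a = np.abs(r)
    return np.where(a <= tau, 0.5 * r ** 2, tau * a - 0.5 * tau ** 2)

def adaptive_huber_fit(X, y, c=1.0):
    """Fit linear regression with a Huber loss whose tuning parameter
    diverges with the sample size (a sketch of the adaptive idea)."""
    n, d = X.shape
    # Crude noise-scale estimate (assumption; the paper adapts tau to the
    # true noise variance, which must be estimated in practice).
    sigma_hat = np.std(y)
    # Hypothetical adaptive rate: tau grows like sqrt(n / log n).
    tau = c * sigma_hat * np.sqrt(n / np.log(n))

    def objective(beta):
        return huber_loss(y - X @ beta, tau).sum()

    # Initialize at the least squares solution, then minimize the Huber objective.
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    res = minimize(objective, beta0, method="BFGS")
    return res.x, tau
```

With heavy-tailed (e.g. Student-t) noise that has finite variance but no higher moments, the Huber fit remains stable where ordinary least squares can be badly skewed by a single extreme residual; as `tau` grows, the estimator interpolates toward least squares, which is what lets it trade robustness against bias.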