On the Acceleration of L-BFGS with Second-Order Information and Stochastic Batches. (arXiv:1807.05328v1 [cs.LG])
Source: arXiv
This paper proposes an L-BFGS framework based on (approximate) second-order
information with stochastic batches, as a novel approach to finite-sum
minimization problems. Unlike classical L-BFGS, where stochastic batches lead
to instability, we use a smooth estimate of the gradient differences while
achieving acceleration by properly scaling the initial Hessians. We provide
theoretical analyses for both the convex and nonconvex cases. In addition, we
demonstrate that for the popular least-squares and cross-entropy losses, the
algorithm admits a simple implementation in a distributed environment.
Numerical experiments support the efficiency of our algorithms.
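To make the two ideas in the abstract concrete, here is a minimal sketch (not the paper's exact algorithm) of stochastic L-BFGS on a synthetic least-squares problem. It combines (a) an exponentially smoothed estimate of the gradient differences `y_k`, standing in for the paper's "smooth estimate", and (b) the standard initial-Hessian scaling `gamma_k = s_k^T y_k / y_k^T y_k` inside the two-loop recursion. All hyperparameters (smoothing factor, step size, batch size, memory) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def batch_grad(x, idx):
    """Mini-batch gradient of (1/2n)||A x - b||^2."""
    Ab = A[idx]
    return Ab.T @ (Ab @ x - b[idx]) / len(idx)

def two_loop(grad, S, Y):
    """Classical L-BFGS two-loop recursion with scaled initial Hessian."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(S), reversed(Y)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if S:  # initial Hessian H0 = gamma * I, gamma = s^T y / y^T y
        s, y = S[-1], Y[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):
        beta = (y @ q) / (y @ s)
        q += (a - beta) * s
    return q  # approximates H_k @ grad

x = np.zeros(d)
S, Y, memory = [], [], 5
x_prev, g_prev = None, None
y_smooth = np.zeros(d)
beta_smooth = 0.5          # hypothetical smoothing factor
lr, batch = 0.5, 32
for k in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    g = batch_grad(x, idx)
    if g_prev is not None:
        s = x - x_prev
        # smooth the noisy stochastic gradient difference before
        # using it as a curvature pair (naive g - g_prev is unstable)
        y_smooth = beta_smooth * y_smooth + (1 - beta_smooth) * (g - g_prev)
        if s @ y_smooth > 1e-10:  # curvature condition
            S.append(s)
            Y.append(y_smooth.copy())
            if len(S) > memory:
                S.pop(0)
                Y.pop(0)
    direction = two_loop(g, S, Y)
    x_prev, g_prev = x.copy(), g
    x = x - lr * direction

print(float(np.linalg.norm(A @ x - b)))  # residual after optimization
```

Without the smoothing step, consecutive gradients come from different mini-batches, so `g - g_prev` mixes sampling noise into the curvature pairs; the exponential average is one simple way to damp that noise before the pair enters the Hessian approximation.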