Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD. (arXiv:1810.04100v1 [math.OC])
Source: arXiv
We study Stochastic Gradient Descent (SGD) with diminishing step sizes for
convex objective functions. We introduce a definitional framework and theory
that defines and characterizes a core property, called curvature, of convex
objective functions. In terms of curvature we can derive a new inequality that
can be used to compute an optimal sequence of diminishing step sizes by solving
a differential equation. Our exact solutions confirm known results in the
literature and allow us to fully characterize a new regularizer with its
corresponding expected convergence rates.
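The abstract describes SGD with a diminishing step-size sequence on a convex objective. The sketch below illustrates that general setup on a hypothetical least-squares problem with a simple schedule eta_t = c / (1 + 0.01 t); the problem, schedule, and constants are illustrative assumptions, not the optimal sequence the paper derives from its differential equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical convex test problem: least squares
#   f(w) = (1/2n) * ||A w - b||^2
# with a planted solution, used only to illustrate the SGD setup.
n, d = 200, 5
A = rng.standard_normal((n, d))
w_true = np.ones(d)
b = A @ w_true + 0.1 * rng.standard_normal(n)
w_star, *_ = np.linalg.lstsq(A, b, rcond=None)  # exact minimizer for reference

def sgd_diminishing(w0, steps=20000, c=0.1):
    """SGD with an illustrative diminishing schedule eta_t = c / (1 + 0.01 t)."""
    w = w0.copy()
    for t in range(steps):
        i = rng.integers(n)                # sample one data point uniformly
        g = A[i] * (A[i] @ w - b[i])       # unbiased stochastic gradient of f
        w -= c / (1.0 + 0.01 * t) * g      # diminishing step size
    return w

w0 = np.zeros(d)
w_hat = sgd_diminishing(w0)
print(np.linalg.norm(w_hat - w_star))      # distance to the exact minimizer
```

Because the step sizes shrink, the iterates settle near the minimizer instead of oscillating in a fixed-radius noise ball; the paper's contribution is choosing this sequence optimally based on the curvature of the objective.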