
Choose your path wisely: gradient descent in a Bregman distance framework. (arXiv:1712.04045v1 [math.OC])

Source: arXiv
We propose an extension of a special form of gradient descent, known in the literature as linearised Bregman iteration, to a larger class of non-convex functionals. We replace the classical (squared) two-norm metric in the gradient descent setting with a generalised Bregman distance based on a proper, convex and lower semi-continuous functional. The proposed algorithm is a generalisation of numerous well-known optimisation methods. Its global convergence is proven for functions that satisfy the Kurdyka–Łojasiewicz property. Examples illustrate that, for suitable choices of Bregman distances, this method (in contrast to traditional gradient descent) allows iterating along regular solution paths. The effectiveness of the linearised Bregman iteration in combination with early stopping is illustrated for the applications of parallel magnetic resonance imaging, blind deconvolution, and image classification.
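
For concreteness, below is a minimal Python sketch of linearised Bregman iteration, under the assumption that the Bregman functional is the elastic-net-type J(x) = ½‖x‖² + λ‖x‖₁, a common choice in the linearised Bregman literature. The objective gradient grad_E, step size tau, parameter lam, and the least-squares example data are illustrative placeholders, not the paper's exact setup.

```python
import numpy as np

def soft_threshold(v, lam):
    # Proximal map of lam * ||.||_1 (component-wise soft shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearised_bregman(grad_E, x0, tau, lam, n_iter=200):
    # Linearised Bregman iteration for min E(x), with the Bregman distance
    # induced by J(x) = 0.5 * ||x||^2 + lam * ||x||_1 (an assumed choice).
    # Starting from x0 = 0 makes p = 0 a valid subgradient of J at x0.
    x = x0.copy()
    p = np.zeros_like(x0)
    for _ in range(n_iter):
        p = p - tau * grad_E(x)       # subgradient (dual) update
        x = soft_threshold(p, lam)    # primal update via the prox of lam*||.||_1
    return x

# Illustrative use: sparse recovery with a least-squares fidelity
# E(x) = 0.5 * ||A x - b||^2 on hypothetical data A, b.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true
x_hat = linearised_bregman(lambda x: A.T @ (A @ x - b), np.zeros(200),
                           tau=1.0 / np.linalg.norm(A, 2) ** 2, lam=0.5)
```

Note that with lam = 0 the soft-thresholding step is the identity, and the scheme reduces to classical gradient descent x⁽ᵏ⁺¹⁾ = x⁽ᵏ⁾ − τ∇E(x⁽ᵏ⁾), consistent with the abstract's claim that the framework generalises standard methods.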