Authors: Paul Tseng, Sangwoon Yun
Publish Date: 2013/09/11
Volume: 160, Issue: 3, Pages: 832-853
Abstract
We consider incrementally updated gradient methods for minimizing the sum of smooth functions and a convex function. These methods can use a sufficiently small constant stepsize or, more practically, an adaptive stepsize that is decreased whenever sufficient progress is not made. We show that if the gradients of the smooth functions are Lipschitz continuous on the space of n-dimensional real column vectors, or if the gradients of the smooth functions are bounded and Lipschitz continuous over a certain level set and the convex function is Lipschitz continuous on its domain, then every cluster point of the iterates generated by the method is a stationary point. If, in addition, a local Lipschitz error bound assumption holds, then the method is linearly convergent.
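To make the setup concrete, below is a minimal Python sketch of the kind of method the abstract describes: one incremental pass cycles through the gradients of the smooth components, each step followed by a proximal map for the convex function, with the stepsize shrunk whenever a pass fails to make sufficient progress. This is an illustration under stated assumptions, not the paper's exact algorithm; the names (incremental_prox_gradient, soft_threshold), the l1 choice of convex function, and the particular progress test are all hypothetical.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal map of t * ||x||_1, one concrete choice of convex function
    # (hypothetical example, not fixed by the paper).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def incremental_prox_gradient(grads, prox, x0, stepsize=1e-2, shrink=0.5,
                              n_epochs=50, objective=None, tol=0.0):
    """Sketch of an incrementally updated proximal-gradient method.

    grads: list of callables, gradients of the smooth components f_i
    prox:  callable prox(x, t) for the convex function
    """
    x = x0.copy()
    prev_obj = objective(x) if objective is not None else None
    for _ in range(n_epochs):
        for g in grads:
            # Incremental update: use one component gradient at a time,
            # then apply the proximal map of the convex function.
            x = prox(x - stepsize * g(x), stepsize)
        if objective is not None:
            obj = objective(x)
            # Adaptive stepsize: decrease it whenever the full pass
            # did not make sufficient progress on the objective.
            if prev_obj is not None and prev_obj - obj <= tol:
                stepsize *= shrink
            prev_obj = obj
    return x

# Illustrative use: m quadratics f_i(x) = 0.5*||A_i x - b_i||^2 plus lam*||x||_1.
rng = np.random.default_rng(0)
A = [rng.standard_normal((5, 3)) for _ in range(4)]
b = [rng.standard_normal(5) for _ in range(4)]
lam = 0.1
grads = [lambda x, Ai=Ai, bi=bi: Ai.T @ (Ai @ x - bi) for Ai, bi in zip(A, b)]

def obj(x):
    smooth = sum(0.5 * np.sum((Ai @ x - bi) ** 2) for Ai, bi in zip(A, b))
    return smooth + lam * np.sum(np.abs(x))

x_star = incremental_prox_gradient(grads, lambda x, t: soft_threshold(x, lam * t),
                                   np.zeros(3), objective=obj)
```

Here each f_i has a Lipschitz continuous gradient and the l1 term is Lipschitz continuous on its domain, so this toy instance satisfies the kind of assumptions under which the abstract's convergence claims are stated.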