Paper Search Console


Journal Title

Journal: Journal of Optimization Theory and Applications
Abbreviation: J Optim Theory Appl


Publisher: Springer US


Incrementally Updated Gradient Methods for Constrained and Regularized Optimization

Authors: Paul Tseng, Sangwoon Yun

Publish Date: 2013/09/11
Volume: 160, Issue: 3, Pages: 832-853


We consider incrementally updated gradient methods for minimizing the sum of smooth functions and a convex function. These methods can use a (sufficiently small) constant stepsize or, more practically, an adaptive stepsize that is decreased whenever sufficient progress is not made. We show that if the gradients of the smooth functions are Lipschitz continuous on the space of n-dimensional real column vectors, or if the gradients of the smooth functions are bounded and Lipschitz continuous over a certain level set and the convex function is Lipschitz continuous on its domain, then every cluster point of the iterates generated by the method is a stationary point. If, in addition, a local Lipschitz error bound assumption holds, then the method is linearly convergent.
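To illustrate the flavor of method the abstract describes, the following is a minimal sketch (not the authors' exact algorithm): each inner iteration refreshes the stored gradient of one smooth component, forms the aggregated (incrementally updated) gradient, and takes a proximal step with a constant stepsize. The toy problem, the l1 regularizer, and all function names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal map of t * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def incremental_prox_gradient(grads, x0, stepsize, lam, n_passes=500):
    """Minimize sum_i f_i(x) + lam * ||x||_1.

    Keeps one stored gradient per smooth component, refreshes them
    cyclically, and proxes with the aggregated gradient estimate.
    A simplified constant-stepsize sketch of this family of methods.
    """
    x = x0.astype(float).copy()
    stored = [g(x) for g in grads]          # stored component gradients
    for _ in range(n_passes):
        for i, g in enumerate(grads):
            stored[i] = g(x)                # refresh one component only
            d = sum(stored)                 # incrementally updated gradient
            x = soft_threshold(x - stepsize * d, stepsize * lam)
    return x

# Toy problem: f1(x) = 0.5*(x-1)^2, f2(x) = 0.5*(x+0.5)^2, lam = 0.1;
# the regularized minimizer is x* = 0.2.
grads = [lambda x: x - 1.0, lambda x: x + 0.5]
x = incremental_prox_gradient(grads, np.zeros(1), 0.05, 0.1)
```

With a sufficiently small constant stepsize the iterates settle at the regularized minimizer, consistent with the stationarity guarantee stated in the abstract; the adaptive-stepsize variant mentioned there would instead shrink `stepsize` whenever a pass fails to make sufficient progress.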


