Journal Title: Computational Optimization and Applications
Journal Abbreviation: Comput Optim Appl

Publisher: Springer US

ISSN: 1573-2894

A coordinate gradient descent method for ℓ1-regularized convex minimization

Authors: Sangwoon Yun, Kim-Chuan Toh
Publish Date: 2009/05/07
Volume: 48, Issue: 2, Pages: 273-307

Abstract

In applications such as signal processing and statistics, many problems involve finding sparse solutions to underdetermined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., ℓ1-regularized linear least squares problems. In this paper we propose a block coordinate gradient descent method (abbreviated as CGD) to solve the more general ℓ1-regularized convex minimization problem, i.e., the problem of minimizing an ℓ1-regularized convex smooth function. We establish a Q-linear convergence rate for our method when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We propose efficient implementations of the CGD method and report numerical results for solving large-scale ℓ1-regularized linear least squares problems arising in compressed sensing and image deconvolution, as well as large-scale ℓ1-regularized logistic regression problems for feature selection in data classification. Comparison with several state-of-the-art algorithms specifically designed for solving large-scale ℓ1-regularized linear least squares or logistic regression problems suggests that an efficiently implemented CGD method may outperform these algorithms, despite the fact that the CGD method is not specifically designed just to solve these special classes of problems.
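
For concreteness, the problems in the abstract have the generic form min_x f(x) + λ‖x‖₁ with f convex and smooth; the compressed-sensing instance takes f(x) = (1/2)‖Ax − b‖². The following is a minimal single-coordinate sketch of the coordinate-descent idea in NumPy, not the authors' CGD implementation: it updates one coordinate per iteration, substitutes the per-coordinate curvature ‖A[:, j]‖² for the paper's general Hessian approximation, omits the Armijo line search, and uses function and variable names invented here.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.| (soft-thresholding), applied elementwise."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_l1_least_squares(A, b, lam, max_iter=5000, tol=1e-8):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by single-coordinate descent
    with a Gauss-Southwell-type coordinate choice (illustrative sketch)."""
    n = A.shape[1]
    x = np.zeros(n)
    r = -b.astype(float)                        # residual Ax - b for x = 0
    h = np.maximum((A * A).sum(axis=0), 1e-12)  # curvature ||A[:, j]||^2, guarded
    for _ in range(max_iter):
        g = A.T @ r                             # gradient of the smooth part
        # Prospective update for every coordinate under the diagonal model.
        x_new = soft_threshold(x - g / h, lam / h)
        d = x_new - x
        j = int(np.argmax(np.abs(d)))           # Gauss-Southwell-type rule
        if abs(d[j]) < tol:                     # no coordinate moves: stop
            break
        r += d[j] * A[:, j]                     # incremental residual update
        x[j] = x_new[j]
    return x

# Tiny usage example on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[:5] = 1.0
x_hat = cgd_l1_least_squares(A, A @ x_true, lam=0.1)
```

Because ‖A[:, j]‖² is the exact curvature of the least-squares term along coordinate j, each update exactly minimizes the objective along the chosen coordinate, which is why this simplified sketch can take a unit step where the paper's general method would invoke a line search.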

