Journal Title
Title of Journal: Computational Optimization and Applications
Abbreviation: Comput Optim Appl
Authors: Sangwoon Yun, Kim-Chuan Toh
Publish Date: 2009/05/07
Volume: 48, Issue: 2, Pages: 273-307
Abstract
In applications such as signal processing and statistics, many problems involve finding sparse solutions to underdetermined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., as minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated as CGD) to solve the more general ℓ1-regularized convex minimization problem, i.e., the problem of minimizing an ℓ1-regularized convex smooth function. We establish a Q-linear convergence rate for our method when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We propose efficient implementations of the CGD method and report numerical results for solving large-scale ℓ1-regularized linear least squares problems arising in compressed sensing and image deconvolution, as well as large-scale ℓ1-regularized logistic regression problems for feature selection in data classification. Comparison with several state-of-the-art algorithms specifically designed for solving large-scale ℓ1-regularized linear least squares or logistic regression problems suggests that an efficiently implemented CGD method may outperform these algorithms, despite the fact that the CGD method is not specifically designed just to solve these special classes of problems.
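For illustration, the following is a minimal Python sketch of a single-coordinate variant of the idea the abstract describes: minimizing F(x) = 0.5*||Ax - b||_2^2 + lam*||x||_1 by taking coordinatewise steps from a separable subproblem (soft-thresholding against a diagonal Hessian approximation), selecting a coordinate block with a Gauss-Southwell-r-style rule, and applying an Armijo line search. The function names (cgd_l1_least_squares, soft_threshold), the parameter nu, and the specific simplifications are assumptions made for this sketch, not the authors' implementation.

import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding: argmin_u 0.5*(u - z)^2 + t*|u|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_l1_least_squares(A, b, lam, nu=0.9, max_iter=500, tol=1e-8):
    # Simplified coordinate gradient descent sketch for
    #     min_x 0.5*||A x - b||_2^2 + lam*||x||_1
    # (hypothetical illustration, not the paper's implementation).
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                # residual A x - b
    h = np.sum(A * A, axis=0)    # diagonal of A^T A as Hessian approximation
    h = np.maximum(h, 1e-12)     # guard against zero columns
    for _ in range(max_iter):
        g = A.T @ r              # gradient of the smooth part
        # Tentative coordinatewise step from the separable subproblem
        d = soft_threshold(x - g / h, lam / h) - x
        dmax = np.max(np.abs(d))
        if dmax <= tol:
            break
        # Gauss-Southwell-r-style rule: keep coordinates with large steps
        J = np.abs(d) >= nu * dmax
        dJ = np.where(J, d, 0.0)
        # Armijo backtracking line search on the full objective
        AdJ = A @ dJ
        F = 0.5 * (r @ r) + lam * np.sum(np.abs(x))
        delta = g @ dJ + lam * (np.sum(np.abs(x + dJ)) - np.sum(np.abs(x)))
        alpha = 1.0
        while True:
            x_new = x + alpha * dJ
            r_new = r + alpha * AdJ
            F_new = 0.5 * (r_new @ r_new) + lam * np.sum(np.abs(x_new))
            if F_new <= F + 0.1 * alpha * delta or alpha < 1e-10:
                break
            alpha *= 0.5
        x, r = x_new, r_new
    return x

A Gauss-Southwell-q variant would instead rank coordinates by the predicted decrease of the subproblem objective rather than by the size of the tentative step.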