Accelerated iterative hard thresholding algorithm for l0 regularized regression problem


Bibliographic Details
Published in: Journal of Global Optimization, Vol. 76, No. 4, pp. 819-840
Main Authors: Wu, Fan, Bian, Wei
Format: Journal Article
Language:English
Published: New York: Springer US, 01.04.2020
Springer Nature B.V.
ISSN:0925-5001, 1573-2916
Description
Summary: In this paper, we propose an accelerated iterative hard thresholding algorithm for solving the l0 regularized box constrained regression problem. We show that there exists a threshold such that, if the extrapolation coefficients are chosen below it, the proposed algorithm is equivalent, after finitely many iterations, to the accelerated proximal gradient algorithm applied to a corresponding constrained convex problem. Under suitable conditions, we prove that the sequence generated by the proposed algorithm converges to a local minimizer of the l0 regularized problem that satisfies a desired lower bound. Moreover, when the data fitting function satisfies an error bound condition, we prove that both the iterate sequence and the corresponding sequence of objective function values are R-linearly convergent. Finally, we verify our theoretical results with several numerical experiments.
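The summary describes an extrapolation (momentum) step paired with a hard-thresholding proximal step for an l0 regularized, box constrained least-squares model. As a rough illustration of how such an iteration can be organized, the following Python sketch minimizes 0.5*||Ax - b||^2 + lam*||x||_0 over the box [lower, upper]; the function name, the fixed extrapolation coefficient beta, the step size choice, and the iteration count are assumptions for illustration and are not taken from the paper.

import numpy as np

def accelerated_iht(A, b, lam, lower, upper, beta=0.3, iters=500):
    # Hypothetical sketch: beta is kept constant here, whereas the paper
    # analyzes extrapolation coefficients chosen below a problem-dependent threshold.
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)               # extrapolation (acceleration) step
        z = y - step * (A.T @ (A @ y - b))        # gradient step on the data-fitting term
        c = np.clip(z, lower, upper)              # nonzero candidate inside the box
        # hard thresholding: keep c_i only if it beats x_i = 0 in the prox objective
        # (assumes 0 lies in the box [lower, upper])
        keep = z ** 2 - (c - z) ** 2 > 2.0 * step * lam
        x_prev, x = x, np.where(keep, c, 0.0)
    return x

For example, accelerated_iht(A, b, lam=0.1, lower=0.0, upper=1.0) would return a sparse estimate whose nonzero entries lie in [0, 1]; a practical implementation would replace the fixed iteration count with a convergence test.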
DOI:10.1007/s10898-019-00826-6