Convergence and Stability of a Class of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise


Full Description

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Vol. 62, No. 1, pp. 183–195
Main authors: Babadi, Behtash; Ba, Demba; Purdon, Patrick L; Brown, Emery N
Format: Journal Article
Language: English
Published: United States, 30 Oct 2013
ISSN: 1053-587X
Online access: Full text
Abstract In this paper, we study the theoretical properties of a class of iteratively re-weighted least squares (IRLS) algorithms for sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between this class of algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution. The IRLS algorithms we consider are parametrized by 0 < ν ≤ 1 and ε > 0. The EM formalism, together with the connection to GSMs, allows us to establish that the IRLS(ν, ε) algorithms minimize ε-smooth versions of the ℓν ‘norms’. We leverage EM theory to show that, for each 0 < ν ≤ 1, the limit points of the sequence of IRLS(ν, ε) iterates are stationary points of the ε-smooth ℓν ‘norm’ minimization problem on the constraint set. Finally, we employ techniques from compressive sampling (CS) theory to show that the class of IRLS(ν, ε) algorithms is stable for each 0 < ν ≤ 1, if the limit point of the iterates coincides with the global minimizer. For the case ν = 1, we show that the algorithm converges exponentially fast to a neighborhood of the stationary point, and we outline its generalization to super-exponential convergence for ν < 1. We demonstrate our claims via simulation experiments. The simplicity of IRLS, along with the theoretical guarantees provided in this contribution, makes a compelling case for its adoption as a standard tool for sparse signal recovery.
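The abstract above describes the basic IRLS(ν, ε) iteration: reweight by the current iterate, then solve a weighted least-squares problem on the constraint set. The sketch below is a minimal, illustrative Python implementation of that scheme for the equality-constrained case Ax = y, using the standard diagonal weights d_i = (x_i² + ε²)^(1 − ν/2); it is not the authors' reference code, ε is held fixed rather than annealed, and all names and parameter choices are assumptions for illustration.

```python
import numpy as np

def irls(A, y, nu=1.0, eps=1e-3, n_iter=100):
    """Illustrative IRLS(nu, eps) sketch: approximately minimize the
    eps-smooth ell-nu 'norm' sum_i (x_i^2 + eps^2)^(nu/2) subject to Ax = y.
    Parameter names and defaults are hypothetical, not from the paper."""
    # Start from the minimum ell-2 norm solution of Ax = y.
    x = A.T @ np.linalg.solve(A @ A.T, y)
    for _ in range(n_iter):
        # Inverse weights: large where |x_i| is large, bounded below by eps.
        d = (x**2 + eps**2) ** (1.0 - nu / 2.0)
        # Weighted least-squares step: x = D A^T (A D A^T)^{-1} y,
        # with D = diag(d); this keeps Ax = y at every iteration.
        ADA = A @ (d[:, None] * A.T)
        x = d * (A.T @ np.linalg.solve(ADA, y))
    return x
```

For ν = 1 this is the exponentially convergent case discussed in the abstract; taking ν < 1 in the same update gives the non-convex variants with the outlined super-exponential behavior, at the cost of sensitivity to initialization.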
Author Ba, Demba
Purdon, Patrick L
Babadi, Behtash
Brown, Emery N
AuthorAffiliation 2 Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114
3 Harvard-MIT Division of Health, Sciences and Technology
1 Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139
Author_xml – sequence: 1
  givenname: Behtash
  surname: Babadi
  fullname: Babadi, Behtash
  organization: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139 ; Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114
– sequence: 2
  givenname: Demba
  surname: Ba
  fullname: Ba, Demba
  organization: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139 ; Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114
– sequence: 3
  givenname: Patrick L
  surname: Purdon
  fullname: Purdon, Patrick L
  organization: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139 ; Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114
– sequence: 4
  givenname: Emery N
  surname: Brown
  fullname: Brown, Emery N
  organization: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139 ; Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, Boston, MA 02114 ; Harvard-MIT Division of Health, Sciences and Technology
BackLink https://www.ncbi.nlm.nih.gov/pubmed/26549965 (View this record in MEDLINE/PubMed)
ContentType Journal Article
DBID NPM
7X8
5PM
DOI 10.1109/TSP.2013.2287685
DatabaseName PubMed
MEDLINE - Academic
PubMed Central (Full Participant titles)
DatabaseTitle PubMed
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic

PubMed
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EndPage 195
ExternalDocumentID PMC4636042
26549965
Genre Journal Article
GrantInformation_xml – fundername: NIH HHS
  grantid: DP1 OD003646
ISSN 1053-587X
IngestDate Tue Sep 30 16:36:02 EDT 2025
Sat Sep 27 21:15:02 EDT 2025
Thu Apr 03 07:06:41 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 1
Language English
LinkModel OpenURL
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
PMID 26549965
PQID 1835632367
PQPubID 23479
PageCount 13
ParticipantIDs pubmedcentral_primary_oai_pubmedcentral_nih_gov_4636042
proquest_miscellaneous_1835632367
pubmed_primary_26549965
PublicationCentury 2000
PublicationDate 2013-Oct-30
20131030
PublicationDateYYYYMMDD 2013-10-30
PublicationDate_xml – month: 10
  year: 2013
  text: 2013-Oct-30
  day: 30
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
PublicationTitle IEEE transactions on signal processing
PublicationTitleAlternate IEEE Trans Signal Process
PublicationYear 2013
SSID ssj0014496
SourceID pubmedcentral
proquest
pubmed
SourceType Open Access Repository
Aggregation Database
Index Database
StartPage 183
Title Convergence and Stability of a Class of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise
URI https://www.ncbi.nlm.nih.gov/pubmed/26549965
https://www.proquest.com/docview/1835632367
https://pubmed.ncbi.nlm.nih.gov/PMC4636042
Volume 62
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Electronic Library (IEL)
  issn: 1053-587X
  databaseCode: RIE
  dateStart: 19910101
  customDbUrl:
  isFulltext: true
  dateEnd: 99991231
  titleUrlDefault: https://ieeexplore.ieee.org/
  omitProxy: false
  ssIdentifier: ssj0014496
  providerName: IEEE
linkProvider IEEE
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Convergence+and+Stability+of+a+Class+of+Iteratively+Re-weighted+Least+Squares+Algorithms+for+Sparse+Signal+Recovery+in+the+Presence+of+Noise&rft.jtitle=IEEE+transactions+on+signal+processing&rft.au=Babadi%2C+Behtash&rft.au=Ba%2C+Demba&rft.au=Purdon%2C+Patrick+L.&rft.au=Brown%2C+Emery+N.&rft.date=2013-10-30&rft.issn=1053-587X&rft.volume=62&rft.issue=1&rft.spage=183&rft.epage=195&rft_id=info:doi/10.1109%2FTSP.2013.2287685&rft_id=info%3Apmid%2F26549965&rft.externalDocID=PMC4636042