Relations Among Some Low-Rank Subspace Recovery Models

Bibliographic Details
Published in: Neural Computation, Vol. 27, No. 9, p. 1915
Main Authors: Zhang, Hongyang; Lin, Zhouchen; Zhang, Chao; Gao, Junbin
Format: Journal Article
Language: English
Published: United States, 01.09.2015
ISSN: 1530-888X
Abstract Recovering the intrinsic low-dimensional subspaces on which data are distributed is a key preprocessing step for many applications. In recent years, much work has modeled subspace recovery as a low-rank minimization problem. We find that some representative models, such as robust principal component analysis (R-PCA), robust low-rank representation (R-LRR), and robust latent low-rank representation (R-LatLRR), are deeply connected: once a solution to one of them is obtained, the solutions to the others follow in closed form. Since R-PCA is the simplest of the three, this discovery makes it the center of low-rank subspace recovery models. Our work has two important implications. First, R-PCA has a solid theoretical foundation: under certain conditions, globally optimal solutions to these low-rank models can be found with overwhelming probability, even though the models are nonconvex. Second, significantly faster algorithms for these models are obtained by solving R-PCA first. The computational cost can be cut further by applying low-complexity randomized algorithms to R-PCA, for example, our novel ℓ2,1 filtering algorithm. Although a formal proof for the ℓ2,1 filtering algorithm is not yet available, experiments verify its advantages over other state-of-the-art methods based on the alternating direction method.
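As a reading aid, the two central models named in the abstract are usually written as follows. This is a sketch in the standard notation of this literature, not text quoted from the paper: X is the data matrix whose columns are samples, \lambda > 0 a trade-off parameter, \|\cdot\|_* the nuclear norm, and \|E\|_{2,1} the sum of the \ell_2 norms of the columns of E, so the noise term is encouraged to be column-sparse (matching the record's mention of an ℓ2,1 filtering algorithm).

  \min_{A,E}\ \|A\|_* + \lambda\,\|E\|_{2,1} \quad \text{s.t.} \quad X = A + E \qquad \text{(R-PCA)}

  \min_{Z,E}\ \|Z\|_* + \lambda\,\|E\|_{2,1} \quad \text{s.t.} \quad X = (X - E)Z + E \qquad \text{(R-LRR)}

R-LatLRR additionally introduces a latent low-rank term so that the clean data are represented along both columns and rows; its exact formulation, and the closed-form maps among the three models' solution sets, are given in the paper itself.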
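The abstract's second implication, solve R-PCA first and read off the other models' solutions in closed form, can be illustrated with a short sketch. The code below is an illustrative reconstruction, not the authors' released implementation: it solves the ℓ2,1-norm R-PCA by a standard alternating direction method and then forms the shape-interaction matrix V Vᵀ from the skinny SVD of the recovered low-rank term, which is how the closed-form R-LRR solution is usually stated in this line of work. The default λ = 1/√max(m, n) and the penalty schedule (mu, rho) are conventional choices, not values taken from the paper.

import numpy as np

def solve_rpca_l21(X, lam=None, mu=1e-2, rho=1.5, tol=1e-7, max_iter=500):
    """Alternating direction method for
        min ||A||_* + lam * ||E||_{2,1}   s.t.   X = A + E,
    i.e., R-PCA with a column-sparse noise term."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # conventional default, an assumption here
    A = np.zeros_like(X)
    E = np.zeros_like(X)
    Y = np.zeros_like(X)                    # Lagrange multiplier
    for _ in range(max_iter):
        # A-step: singular value thresholding of X - E + Y/mu
        U, s, Vt = np.linalg.svd(X - E + Y / mu, full_matrices=False)
        A = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # E-step: column-wise shrinkage, the proximal operator of the l2,1 norm
        G = X - A + Y / mu
        col_norms = np.linalg.norm(G, axis=0)
        E = G * np.maximum(1.0 - (lam / mu) / np.maximum(col_norms, 1e-12), 0.0)
        # dual update and penalty increase
        R = X - A - E
        Y += mu * R
        mu = min(mu * rho, 1e10)
        if np.linalg.norm(R) <= tol * np.linalg.norm(X):
            break
    return A, E

def rlrr_from_rpca(A, E, tol=1e-8):
    """Closed-form R-LRR solution (Z, E) from an R-PCA solution (A, E):
    Z = V V^T, the shape-interaction matrix of the low-rank term A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    r = int(np.sum(s > tol * max(s.max(), 1e-30)))   # numerical rank of A
    V = Vt[:r].T
    return V @ V.T, E

A toy usage, assuming data sampled from a 5-dimensional subspace with a few corrupted columns:

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 200))
X[:, :10] += 10.0 * rng.standard_normal((100, 10))   # corrupt 10 columns
A, E = solve_rpca_l21(X)
Z, _ = rlrr_from_rpca(A, E)    # |Z| can then be fed to spectral clustering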
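The abstract further claims that the cost can be cut by low-complexity randomized algorithms such as the authors' ℓ2,1 filtering algorithm. The paper's exact procedure is not reproduced here; the sketch below shows only the generic seed-and-filter idea behind such speedups, under my own assumptions: solve R-PCA exactly on a small random column sample to estimate the column space, then classify every column as clean or corrupted by its projection residual. The sample size k and threshold thresh are hypothetical tuning parameters, and solve_rpca_l21 is the routine from the previous sketch.

def seed_and_filter(X, k=50, thresh=0.3, seed=0):
    """Generic randomized seed-and-filter speedup (illustrative only; NOT
    the paper's l2,1 filtering algorithm): exact R-PCA on a column sample,
    then per-column filtering against the recovered subspace."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    idx = rng.choice(n, size=min(k, n), replace=False)
    A_seed, _ = solve_rpca_l21(X[:, idx])    # small, cheap R-PCA on the seed
    U, s, _ = np.linalg.svd(A_seed, full_matrices=False)
    r = max(int(np.sum(s > 1e-8 * max(s.max(), 1e-30))), 1)
    U = U[:, :r]                             # estimated basis of the clean subspace
    proj = U @ (U.T @ X)                     # project all columns onto it
    res = np.linalg.norm(X - proj, axis=0)
    res /= np.maximum(np.linalg.norm(X, axis=0), 1e-12)
    outlier = res > thresh                   # large relative residual => corrupted
    A = proj.copy()
    A[:, outlier] = 0.0                      # convention: corrupted columns go wholly to E
    E = X - A
    return A, E, outlier

Because the SVD and the R-PCA solve touch only k columns, the per-column filtering dominates the cost, which is the general motivation for filtering-style methods; this sketch says nothing about the paper's own complexity or correctness claims, and it can fail if the random sample happens to contain mostly corrupted columns.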
Author_xml – sequence: 1
  givenname: Hongyang
  surname: Zhang
  fullname: Zhang, Hongyang
  email: hy_zh@pku.edu.cn
  organization: Key Laboratory of Machine Perception, School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China, and Cooperative Medianet Center, Shanghai Jiaotong University, Shanghai 200240, China
– sequence: 2
  givenname: Zhouchen
  surname: Lin
  fullname: Lin, Zhouchen
  email: zlin@pku.edu.cn
  organization: Key Laboratory of Machine Perception, School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China, and Cooperative Medianet Center, Shanghai Jiaotong University, Shanghai 200240, China
– sequence: 3
  givenname: Chao
  surname: Zhang
  fullname: Zhang, Chao
  email: chzhang@cis.pku.edu.cn
  organization: Key Laboratory of Machine Perception, School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China, and Cooperative Medianet Center, Shanghai Jiaotong University, Shanghai 200240, China
– sequence: 4
  givenname: Junbin
  surname: Gao
  fullname: Gao, Junbin
  email: jbgao@csu.edu.au
  organization: School of Computing and Mathematics, Charles Sturt University, Bathurst, NSW 2795, Australia
BackLink https://www.ncbi.nlm.nih.gov/pubmed/26161818 (View this record in MEDLINE/PubMed)
ContentType Journal Article
DOI 10.1162/NECO_a_00762
Discipline Computer Science
EISSN 1530-888X
ExternalDocumentID 26161818
Genre Research Support, Non-U.S. Gov't
Journal Article
ISICitedReferencesCount 22
ISSN 1530-888X
IsPeerReviewed true
IsScholarly true
Issue 9
Language English
PMID 26161818
PublicationDate 2015-09-01
PublicationPlace United States
PublicationTitle Neural computation
PublicationTitleAlternate Neural Comput
PublicationYear 2015
StartPage 1915
Title Relations Among Some Low-Rank Subspace Recovery Models
URI https://www.ncbi.nlm.nih.gov/pubmed/26161818
https://www.proquest.com/docview/1705731451
Volume 27