A Generalized Model for Robust Tensor Factorization With Noise Modeling by Mixture of Gaussians
The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix factorization technique, it can better preserve the intrinsic structure information and thus has a better low-dimensional subspace recovery...
| Published in: | IEEE Transactions on Neural Networks and Learning Systems Vol. 29; no. 11; pp. 5380 - 5393 |
|---|---|
| Main Authors: | Chen, Xi'ai; Han, Zhi; Wang, Yao; Zhao, Qian; Meng, Deyu; Lin, Lin; Tang, Yandong |
| Format: | Journal Article |
| Language: | English |
| Published: |
United States
IEEE
01.11.2018
The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Subjects: | |
| ISSN: | 2162-237X, 2162-2388 |
| Online Access: | Get full text |
| Abstract | The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix factorization technique, it can better preserve the intrinsic structure information and thus has a better low-dimensional subspace recovery performance. Basically, the desired low-rank tensor is recovered by minimizing the least square loss between the input data and its factorized representation. Since the least square loss is optimal only when the noise follows a Gaussian distribution, $L_{1}$-norm-based methods are designed to deal with outliers. Unfortunately, they may lose their effectiveness when dealing with real data, which are often contaminated by complex noise. In this paper, we consider integrating the noise modeling technique into a generalized weighted LRTF (GWLRTF) procedure. This procedure treats the original issue as an LRTF problem and models the noise using a mixture of Gaussians (MoG), a procedure called MoG GWLRTF. To extend the applicability of the model, two typical tensor factorization operations, i.e., CANDECOMP/PARAFAC factorization and Tucker factorization, are incorporated into the LRTF procedure. Its parameters are updated under the expectation-maximization framework. Extensive experiments indicate the respective advantages of these two versions of MoG GWLRTF in various applications and also demonstrate their effectiveness compared with other competing methods. |
|---|---|
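The pipeline the abstract describes (a least-squares CP factorization whose residuals are modeled by a mixture of Gaussians fitted with EM) can be sketched in a few lines. This is a simplified illustration, not the authors' MoG GWLRTF algorithm: it runs a plain CP-ALS and then fits a two-component Gaussian mixture to the residuals post hoc, whereas MoG GWLRTF alternates the EM noise-model updates with a responsibility-weighted factorization. The function names (`cp_als`, `khatri_rao`, `unfold`) are our own; only NumPy and scikit-learn's `GaussianMixture` are assumed.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front, flatten the rest (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x R) and B (J x R) -> (I*J x R)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, rank, n_iter=200, seed=0):
    """Plain least-squares CP (CANDECOMP/PARAFAC) via alternating least squares."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        # Each factor solves a linear least-squares problem against the
        # Khatri-Rao product of the other two (Hadamard trick for the Gram matrix).
        A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the 3-way tensor from its CP factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Synthetic rank-2 tensor corrupted by small Gaussian noise plus sparse outliers,
# i.e. the kind of "complex noise" a single Gaussian (least squares) mis-models.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (30, 25, 20))
clean = reconstruct(A0, B0, C0)
noisy = clean + 0.01 * rng.standard_normal(clean.shape)
outliers = rng.random(clean.shape) < 0.05          # 5% gross corruptions
noisy[outliers] += 5.0 * rng.standard_normal(outliers.sum())

A, B, C = cp_als(noisy, rank=2)
residual = (noisy - reconstruct(A, B, C)).reshape(-1, 1)

# EM fit of a 2-component MoG to the residuals. MoG GWLRTF instead feeds the
# per-entry responsibilities back as weights for the next factorization step,
# down-weighting entries assigned to the high-variance (outlier) component.
mog = GaussianMixture(n_components=2, random_state=0).fit(residual)
print("component variances:", np.sort(mog.covariances_.ravel()))
```

The two fitted variances typically separate into a small "inlier" component and a large "outlier" component, which is the signal the EM weighting scheme exploits.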
| Author | Han, Zhi; Wang, Yao; Meng, Deyu; Lin, Lin; Tang, Yandong; Zhao, Qian; Chen, Xi'ai |
| Author_xml | – sequence: 1 givenname: Xi'ai orcidid: 0000-0003-4756-3962 surname: Chen fullname: Chen, Xi'ai email: chenxiai@sia.cn organization: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China – sequence: 2 givenname: Zhi orcidid: 0000-0002-8039-6679 surname: Han fullname: Han, Zhi email: hanzhi@sia.cn organization: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China – sequence: 3 givenname: Yao orcidid: 0000-0003-4207-5273 surname: Wang fullname: Wang, Yao email: yao.s.wang@gmail.com organization: School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China – sequence: 4 givenname: Qian orcidid: 0000-0001-9956-0064 surname: Zhao fullname: Zhao, Qian email: timmy.zhaoqian@gmail.com organization: School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China – sequence: 5 givenname: Deyu orcidid: 0000-0002-1294-8283 surname: Meng fullname: Meng, Deyu email: dymeng@mail.xjtu.edu.cn organization: School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China – sequence: 6 givenname: Lin orcidid: 0000-0002-0793-5795 surname: Lin fullname: Lin, Lin email: 610674737@qq.com organization: School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China – sequence: 7 givenname: Yandong orcidid: 0000-0003-3805-7654 surname: Tang fullname: Tang, Yandong email: ytang@sia.cn organization: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/29994738 (View this record in MEDLINE/PubMed) |
| CODEN | ITNNAL |
| CitedBy_id | crossref_primary_10_1109_TGRS_2023_3277848 crossref_primary_10_3390_rs15081970 crossref_primary_10_1109_TGRS_2021_3085779 crossref_primary_10_1109_TGRS_2023_3237865 crossref_primary_10_1109_TNNLS_2023_3280086 crossref_primary_10_12677_AAM_2022_1112896 crossref_primary_10_1109_TNNLS_2019_2956926 crossref_primary_10_1109_JSTARS_2020_3046488 crossref_primary_10_1109_TNNLS_2020_3009210 crossref_primary_10_1109_TGRS_2022_3217051 crossref_primary_10_1109_TFUZZ_2023_3291488 crossref_primary_10_1109_TNNLS_2021_3112577 crossref_primary_10_1109_TNNLS_2023_3248156 crossref_primary_10_1016_j_ins_2023_02_012 crossref_primary_10_1016_j_neucom_2021_06_020 crossref_primary_10_1016_j_sigpro_2020_107889 crossref_primary_10_1016_j_sigpro_2020_107527 crossref_primary_10_1109_TCSVT_2024_3442295 crossref_primary_10_1109_TNNLS_2023_3266841 crossref_primary_10_1109_TNNLS_2022_3214307 crossref_primary_10_1109_TNNLS_2021_3106654 crossref_primary_10_1109_TCI_2021_3126232 crossref_primary_10_1109_TNNLS_2019_2929063 crossref_primary_10_3390_rs14184470 crossref_primary_10_1109_TNNLS_2021_3104837 crossref_primary_10_1016_j_ins_2019_06_061 crossref_primary_10_1109_TNNLS_2021_3083931 crossref_primary_10_1016_j_patcog_2024_110995 crossref_primary_10_1016_j_patrec_2022_10_005 crossref_primary_10_1007_s11042_023_16561_w crossref_primary_10_1109_TCYB_2021_3067676 crossref_primary_10_1016_j_jfranklin_2023_07_024 crossref_primary_10_1109_TCYB_2022_3169800 crossref_primary_10_1109_TNNLS_2018_2873655 crossref_primary_10_1109_TPAMI_2019_2923240 crossref_primary_10_1109_TNNLS_2023_3293328 crossref_primary_10_1109_TGRS_2022_3227735 crossref_primary_10_1016_j_sigpro_2024_109407 crossref_primary_10_1016_j_patcog_2020_107310 crossref_primary_10_1109_ACCESS_2020_3024635 crossref_primary_10_1109_TNNLS_2018_2885616 crossref_primary_10_1109_TNNLS_2019_2956153 crossref_primary_10_1007_s10489_024_05899_9 crossref_primary_10_1109_LSP_2020_2991581 crossref_primary_10_1016_j_neunet_2022_05_023 
crossref_primary_10_1109_TGRS_2024_3379199 crossref_primary_10_3390_rs12081278 crossref_primary_10_1109_TNNLS_2019_2957527 crossref_primary_10_1109_TNNLS_2022_3172184 |
| Cites_doi | 10.1007/BF02310791 10.1109/TNNLS.2015.2465178 10.1109/CVPR.1991.139758 10.1109/ISIT.2010.5513535 10.1109/CVPR.2003.1211457 10.1109/LGRS.2008.915736 10.1007/s11432-015-5419-2 10.1109/TIP.2010.2046811 10.1145/1970392.1970395 10.1007/s11263-007-0099-z 10.1137/140997816 10.1364/JOSAA.4.000519 10.1111/j.2517-6161.1977.tb01600.x 10.1109/ICCV.2015.39 10.1109/TPAMI.2012.39 10.1109/CVPR.2014.377 10.1002/mrm.1910350312 10.1109/TIP.2007.899598 10.1109/TPAMI.2015.2392756 10.1002/sapm192761164 10.1109/34.598228 10.1109/TIP.2011.2109730 10.1214/07-AOAS131 10.1007/BF00129684 10.1109/WQV.1993.262951 10.1016/j.patcog.2006.08.004 10.1109/34.927464 10.1007/3-540-47969-4_30 10.1109/ICCV.2013.169 10.1145/1073204.1073209 10.1007/s11432-014-5223-4 10.1007/BF02289464 10.1016/j.laa.2004.01.016 10.1109/TIP.2004.836169 10.1109/ICCV.2003.1238452 10.1109/CVPR.2016.563 10.1109/CDC.2013.6760155 10.1002/1099-128X(200005/06)14:3<105::AID-CEM582>3.0.CO;2-I 10.1109/TGRS.2012.2187063 10.1016/0024-3795(77)90069-6 10.1016/j.neucom.2016.10.030 10.1137/1.9781611972801.19 10.1016/j.sigpro.2004.11.029 10.18637/jss.v033.i01 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018 |
| Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018 |
| DBID | 97E RIA RIE AAYXX CITATION NPM 7QF 7QO 7QP 7QQ 7QR 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D P64 7X8 |
| DOI | 10.1109/TNNLS.2018.2796606 |
| DatabaseName | IEEE Xplore (IEEE) IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Xplore CrossRef PubMed Aluminium Industry Abstracts Biotechnology Research Abstracts Calcium & Calcified Tissue Abstracts Ceramic Abstracts Chemoreception Abstracts Computer and Information Systems Abstracts Corrosion Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts Materials Business File Mechanical & Transportation Engineering Abstracts Neurosciences Abstracts Solid State and Superconductivity Abstracts METADEX Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database Materials Research Database ProQuest Computer Science Collection Civil Engineering Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional Biotechnology and BioEngineering Abstracts MEDLINE - Academic |
| DatabaseTitle | CrossRef PubMed Materials Research Database Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Materials Business File Aerospace Database Engineered Materials Abstracts Biotechnology Research Abstracts Chemoreception Abstracts Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering Civil Engineering Abstracts Aluminium Industry Abstracts Electronics & Communications Abstracts Ceramic Abstracts Neurosciences Abstracts METADEX Biotechnology and BioEngineering Abstracts Computer and Information Systems Abstracts Professional Solid State and Superconductivity Abstracts Engineering Research Database Calcium & Calcified Tissue Abstracts Corrosion Abstracts MEDLINE - Academic |
| DatabaseTitleList | PubMed MEDLINE - Academic Materials Research Database |
| Database_xml | – sequence: 1 dbid: NPM name: PubMed url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 2 dbid: RIE name: IEEE Electronic Library (IEL) url: https://ieeexplore.ieee.org/ sourceTypes: Publisher – sequence: 3 dbid: 7X8 name: MEDLINE - Academic url: https://search.proquest.com/medline sourceTypes: Aggregation Database |
| DeliveryMethod | fulltext_linktorsrc |
| Discipline | Computer Science |
| EISSN | 2162-2388 |
| EndPage | 5393 |
| ExternalDocumentID | 29994738 10_1109_TNNLS_2018_2796606 8305626 |
| Genre | orig-research Research Support, Non-U.S. Gov't Journal Article |
| GrantInformation_xml | – fundername: Key Research Program of Hunan Province, China grantid: 2017GK2273 – fundername: China Postdoctoral Science Foundation grantid: 2017M610628 funderid: 10.13039/501100002858 – fundername: Youth Innovation Promotion Association of the Chinese Academy of Sciences grantid: 2016183 funderid: 10.13039/501100004739 – fundername: National Natural Science Foundation of China grantid: 61773367; 61303168; 11501440; 61333019; 61373114 funderid: 10.13039/501100001809 |
| GroupedDBID | 0R~ 4.4 5VS 6IK 97E AAJGR AARMG AASAJ AAWTH ABAZT ABQJQ ABVLG ACIWK ACPRK AENEX AFRAH AGQYO AGSQL AHBIQ AKJIK AKQYR ALMA_UNASSIGNED_HOLDINGS ATWAV BEFXN BFFAM BGNUA BKEBE BPEOZ EBS EJD IFIPE IPLJI JAVBF M43 MS~ O9- OCL PQQKQ RIA RIE RNS AAYXX CITATION NPM RIG 7QF 7QO 7QP 7QQ 7QR 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D P64 7X8 |
| ID | FETCH-LOGICAL-c351t-ecbbca32f9e0f87ec8153a22472cb10dc046b64d6bea6b1632eca0d1489035f93 |
| IEDL.DBID | RIE |
| ISICitedReferencesCount | 54 |
| ISSN | 2162-237X 2162-2388 |
| IngestDate | Thu Oct 02 07:08:01 EDT 2025 Sun Sep 07 03:47:36 EDT 2025 Thu Jan 02 22:59:41 EST 2025 Tue Nov 18 21:32:39 EST 2025 Sat Nov 29 01:39:59 EST 2025 Wed Aug 27 02:52:48 EDT 2025 |
| IsPeerReviewed | false |
| IsScholarly | true |
| Issue | 11 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html |
| LinkModel | DirectLink |
| MergedId | FETCHMERGED-LOGICAL-c351t-ecbbca32f9e0f87ec8153a22472cb10dc046b64d6bea6b1632eca0d1489035f93 |
| Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
| ORCID | 0000-0003-4756-3962 0000-0003-4207-5273 0000-0002-0793-5795 0000-0003-3805-7654 0000-0001-9956-0064 0000-0002-8039-6679 0000-0002-1294-8283 |
| PMID | 29994738 |
| PQID | 2121968249 |
| PQPubID | 85436 |
| PageCount | 14 |
| ParticipantIDs | pubmed_primary_29994738 proquest_miscellaneous_2068341536 proquest_journals_2121968249 crossref_citationtrail_10_1109_TNNLS_2018_2796606 crossref_primary_10_1109_TNNLS_2018_2796606 ieee_primary_8305626 |
| PublicationCentury | 2000 |
| PublicationDate | 2018-11-01 |
| PublicationDateYYYYMMDD | 2018-11-01 |
| PublicationDate_xml | – month: 11 year: 2018 text: 2018-11-01 day: 01 |
| PublicationDecade | 2010 |
| PublicationPlace | United States |
| PublicationPlace_xml | – name: United States – name: Piscataway |
| PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems |
| PublicationTitleAbbrev | TNNLS |
| PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
| PublicationYear | 2018 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| References | ref13 ref12 ref15 ref53 ref52 ref55 ref10 ref17 ref19 ref18 huang (ref29) 2008 zhao (ref35) 2014 ref51 ref50 ref46 wright (ref1) 2009 ref45 ref48 ref47 ref42 ref41 harshman (ref11) 1970; 16 ref49 ref8 ref7 ref9 ref4 ref3 ref6 ref5 ref40 mei (ref54) 2017 wainwright (ref39) 2000 ref34 ref37 ref36 ref31 ref33 ref2 ref38 bader (ref44) 2015 friedman (ref43) 2010; 33 ref24 ref23 ref26 ref25 ref20 ref22 ref21 chi (ref30) 2010 ref28 ref27 savas (ref16) 2003 gu (ref32) 2014 rai (ref14) 2014 |
| References_xml | – start-page: 1 year: 2014 ident: ref35 article-title: Robust principal component analysis with complex noise publication-title: Proc Int Conf Mach Learn – ident: ref10 doi: 10.1007/BF02310791 – ident: ref53 doi: 10.1109/TNNLS.2015.2465178 – ident: ref3 doi: 10.1109/CVPR.1991.139758 – ident: ref31 doi: 10.1109/ISIT.2010.5513535 – start-page: 1 year: 2008 ident: ref29 article-title: Robust tensor factorization using R1 norm publication-title: Proc IEEE Conf Comput Vis Pattern Recognit (CVPR) – ident: ref24 doi: 10.1109/CVPR.2003.1211457 – volume: 16 start-page: 1 year: 1970 ident: ref11 article-title: Foundations of the parafac procedure: Models and conditions for an 'explanatory' multimodal factor analysis publication-title: UCLA Working Papers Phonetics – start-page: 1800 year: 2014 ident: ref14 article-title: Scalable Bayesian low-rank decomposition of incomplete multiway tensors publication-title: Proc Int Conf Mach Learn – ident: ref46 doi: 10.1109/LGRS.2008.915736 – ident: ref33 doi: 10.1007/s11432-015-5419-2 – ident: ref50 doi: 10.1109/TIP.2010.2046811 – ident: ref8 doi: 10.1145/1970392.1970395 – ident: ref7 doi: 10.1007/s11263-007-0099-z – ident: ref55 doi: 10.1137/140997816 – ident: ref2 doi: 10.1364/JOSAA.4.000519 – ident: ref40 doi: 10.1111/j.2517-6161.1977.tb01600.x – ident: ref36 doi: 10.1109/ICCV.2015.39 – ident: ref9 doi: 10.1109/TPAMI.2012.39 – ident: ref47 doi: 10.1109/CVPR.2014.377 – ident: ref27 doi: 10.1002/mrm.1910350312 – ident: ref38 doi: 10.1109/TIP.2007.899598 – ident: ref13 doi: 10.1109/TPAMI.2015.2392756 – ident: ref18 doi: 10.1002/sapm192761164 – ident: ref4 doi: 10.1109/34.598228 – ident: ref48 doi: 10.1109/TIP.2011.2109730 – start-page: 2080 year: 2009 ident: ref1 article-title: Robust principal component analysis: Exact recovery of corrupted low-rank matrices via convex optimization publication-title: Proc Adv Neural Inf Process Syst – ident: ref42 doi: 10.1214/07-AOAS131 – ident: ref5 doi: 10.1007/BF00129684 – 
start-page: 1 year: 2017 ident: ref54 article-title: Cauchy noise removal by nonconvex ADMM with convergence guarantees publication-title: J Sci Comput – ident: ref6 doi: 10.1109/WQV.1993.262951 – ident: ref22 doi: 10.1016/j.patcog.2006.08.004 – ident: ref49 doi: 10.1109/34.927464 – year: 2010 ident: ref30 publication-title: Making Tensor Factorizations Robust to Non-Gaussian Noise – ident: ref23 doi: 10.1007/3-540-47969-4_30 – ident: ref34 doi: 10.1109/ICCV.2013.169 – ident: ref25 doi: 10.1145/1073204.1073209 – ident: ref41 doi: 10.1007/s11432-014-5223-4 – ident: ref19 doi: 10.1007/BF02289464 – ident: ref20 doi: 10.1016/j.laa.2004.01.016 – ident: ref28 doi: 10.1109/TIP.2004.836169 – ident: ref26 doi: 10.1109/ICCV.2003.1238452 – ident: ref37 doi: 10.1109/CVPR.2016.563 – year: 2015 ident: ref44 publication-title: MATLAB Tensor Toolbox Version 2 6 – ident: ref51 doi: 10.1109/CDC.2013.6760155 – start-page: 855 year: 2000 ident: ref39 article-title: Scale mixtures of Gaussians and the statistics of natural images publication-title: Proc Adv Neural Inf Process Syst – ident: ref12 doi: 10.1002/1099-128X(200005/06)14:3<105::AID-CEM582>3.0.CO;2-I – ident: ref45 doi: 10.1109/TGRS.2012.2187063 – year: 2003 ident: ref16 article-title: Analyses and tests of handwritten digit recognition algorithms – start-page: 1422 year: 2014 ident: ref32 article-title: Robust tensor decomposition with gross corruption publication-title: Proc Adv Neural Inf Process Syst – ident: ref17 doi: 10.1016/0024-3795(77)90069-6 – ident: ref52 doi: 10.1016/j.neucom.2016.10.030 – ident: ref15 doi: 10.1137/1.9781611972801.19 – ident: ref21 doi: 10.1016/j.sigpro.2004.11.029 – volume: 33 start-page: 1 year: 2010 ident: ref43 article-title: Regularization paths for generalized linear models via coordinate descent publication-title: J Statist Softw doi: 10.18637/jss.v033.i01 |
| SSID | ssj0000605649 |
| Score | 2.485158 |
| Snippet | The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix... |
| SourceID | proquest pubmed crossref ieee |
| SourceType | Aggregation Database Index Database Enrichment Source Publisher |
| StartPage | 5380 |
| SubjectTerms | Computational modeling Computer vision Data processing Expectation–maximization (EM) algorithm Factorization Gaussian distribution generalized weighted low-rank tensor factorization (GWLRTF) Indexes Learning systems Least squares mixture of Gaussians (MoG) model Modelling Noise Noise pollution Normal distribution Oligodendrocyte-myelin glycoprotein Optimization Outliers (statistics) Robustness Tensile stress tensor factorization |
| Title | A Generalized Model for Robust Tensor Factorization With Noise Modeling by Mixture of Gaussians |
| URI | https://ieeexplore.ieee.org/document/8305626 https://www.ncbi.nlm.nih.gov/pubmed/29994738 https://www.proquest.com/docview/2121968249 https://www.proquest.com/docview/2068341536 |
| Volume | 29 |
| hasFullText | 1 |
| inHoldings | 1 |
| isFullTextHit | |
| isPrint | |
| journalDatabaseRights | – providerCode: PRVIEE databaseName: IEEE Electronic Library (IEL) customDbUrl: eissn: 2162-2388 dateEnd: 99991231 omitProxy: false ssIdentifier: ssj0000605649 issn: 2162-237X databaseCode: RIE dateStart: 20120101 isFulltext: true titleUrlDefault: https://ieeexplore.ieee.org/ providerName: IEEE |
| linkProvider | IEEE |