Learning Latent Features With Infinite Nonnegative Binary Matrix Trifactorization
Nonnegative matrix factorization (NMF) has been widely exploited in many computational intelligence and pattern recognition problems. In particular, it can be used to extract latent features from data. However, previous NMF models often assume a fixed number of features, which are normally tuned and...
| Published in: | IEEE Transactions on Emerging Topics in Computational Intelligence, Volume 2, Issue 6, pp. 450-463 |
|---|---|
| Main authors: | Yang, Xi; Huang, Kaizhu; Zhang, Rui; Hussain, Amir |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Piscataway: IEEE, 01.12.2018 (The Institute of Electrical and Electronics Engineers, Inc.) |
| ISSN: | 2471-285X |
| Online access: | Get full text |
| Abstract | Nonnegative matrix factorization (NMF) has been widely exploited in many computational intelligence and pattern recognition problems. In particular, it can be used to extract latent features from data. However, previous NMF models often assume a fixed number of features, which are normally tuned and searched using a trial and error approach. Learning binary features is also difficult, since the binary matrix posits a more challenging optimization problem. In this paper, we propose a new Bayesian model, termed the infinite nonnegative binary matrix trifactorization (iNBMT) model. This can automatically learn both latent binary features and feature numbers, based on the Indian buffet process (IBP). It exploits a trifactorization process that decomposes the nonnegative matrix into a product of three components: two binary matrices and a nonnegative real matrix. In contrast to traditional bifactorization, trifactorization can better reveal latent structures among samples and features. Specifically, an IBP prior is imposed on two infinite binary matrices, while a truncated Gaussian distribution is assumed on the weight matrix. To optimize the model, we develop a modified variational-Bayesian algorithm, with iteration complexity one order lower than the recently proposed maximization-expectation-IBP model [1] and the correlated IBP-IBP model [2]. A series of simulation experiments are carried out, both qualitatively and quantitatively, using benchmark feature extraction, reconstruction, and clustering tasks. Comparative results show that our proposed iNBMT model significantly outperforms state-of-the-art algorithms on a range of synthetic and real-world data. The new Bayesian model can thus serve as a benchmark technique for the computational intelligence research community. |
|---|---|
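The abstract above specifies the structure the model imposes: a nonnegative data matrix is approximated by the product of two binary feature-assignment matrices (each given an IBP prior) and a nonnegative real weight matrix (given a truncated Gaussian prior). The NumPy sketch below only illustrates that generative structure; the matrix names (Z1, W, Z2), the sizes, and the simple sequential IBP sampler are assumptions added here for demonstration, and the paper's modified variational-Bayesian inference algorithm is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ibp(n_rows, alpha, rng):
    """Draw a binary feature matrix Z (n_rows x K) from the Indian buffet process.

    Customer i takes an existing dish k with probability m_k / i (m_k = number of
    earlier customers who took it), then samples Poisson(alpha / i) new dishes.
    """
    columns = []                                   # one length-n_rows 0/1 column per dish
    for i in range(1, n_rows + 1):
        for col in columns:
            m_k = col[:i - 1].sum()
            col[i - 1] = int(rng.random() < m_k / i)
        for _ in range(rng.poisson(alpha / i)):    # brand-new dishes introduced by customer i
            col = np.zeros(n_rows, dtype=int)
            col[i - 1] = 1
            columns.append(col)
    if not columns:                                # degenerate draw with no features
        return np.zeros((n_rows, 0), dtype=int)
    return np.column_stack(columns)

# Hypothetical sizes chosen only for illustration.
N, D, alpha = 30, 20, 2.0
Z1 = sample_ibp(N, alpha, rng)                     # N x K binary sample-feature matrix
Z2 = sample_ibp(D, alpha, rng)                     # D x L binary attribute-feature matrix
K, L = Z1.shape[1], Z2.shape[1]

# The paper places a truncated Gaussian on the nonnegative weight matrix W;
# clipping Gaussian draws at zero is a crude stand-in for that prior.
W = np.clip(rng.normal(loc=1.0, scale=0.5, size=(K, L)), 0.0, None)

X_hat = Z1 @ W @ Z2.T                              # nonnegative trifactorized reconstruction
print(X_hat.shape, "latent features:", K, L)
```

Running the sketch prints the reconstruction's shape together with the numbers of latent row and column features, which the IBP draws rather than fixing in advance, mirroring the model's ability to infer the feature numbers automatically.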
| Author | Hussain, Amir; Zhang, Rui; Yang, Xi; Huang, Kaizhu |
| AuthorDetails | 1. Yang, Xi (ORCID 0000-0002-8600-2570), Xi.Yang@xjtlu.edu.cn, Xi’an Jiaotong-Liverpool University, Suzhou, Jiangsu, China
2. Huang, Kaizhu (ORCID 0000-0002-3034-9639), Kaizhu.Huang@xjtlu.edu.cn, Xi’an Jiaotong-Liverpool University, Suzhou, Jiangsu, China
3. Zhang, Rui, Rui.Zhang02@xjtlu.edu.cn, Xi’an Jiaotong-Liverpool University, Suzhou, Jiangsu, China
4. Hussain, Amir (ORCID 0000-0002-8080-082X), ahu@cs.stir.ac.uk, Division of Computing Science & Maths, School of Natural Sciences, University of Stirling, Stirling, U.K. |
| CODEN | ITETCU |
| CitedBy_id | crossref_primary_10_3390_rs11212531 crossref_primary_10_1016_j_neunet_2020_06_003 crossref_primary_10_1016_j_neucom_2019_11_070 crossref_primary_10_1109_ACCESS_2019_2899721 crossref_primary_10_1109_TETCI_2021_3104330 crossref_primary_10_1007_s12559_018_9565_x crossref_primary_10_1109_ACCESS_2020_3016981 crossref_primary_10_1016_j_comnet_2021_108088 crossref_primary_10_1177_01423312231217767 crossref_primary_10_1016_j_neucom_2019_01_031 crossref_primary_10_1186_s41044_018_0037_9 crossref_primary_10_3390_math9111189 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018 |
| DOI | 10.1109/TETCI.2018.2806934 |
| EISSN | 2471-285X |
| EndPage | 463 |
| ExternalDocumentID | 10_1109_TETCI_2018_2806934 8320965 |
| Genre | orig-research |
| GrantInformation | Key Program Special Fund in XJTLU (KSF-A-01); Natural Science Fund for Colleges and Universities in Jiangsu Province (17KJD520010); Jiangsu University Natural Science Research Programme (17KJB520041); National Natural Science Foundation of China (61473236; funder ID 10.13039/501100001809); UK Engineering and Physical Sciences Research Council (EP/M026981/1); Suzhou Science and Technology Program (SYG201712, SZS201613) |
| ISICitedReferencesCount | 13 |
| ISSN | 2471-285X |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 6 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html |
| ORCID | 0000-0002-8080-082X 0000-0002-3034-9639 0000-0002-8600-2570 |
| PageCount | 14 |
| PublicationDate | 2018-12-01 |
| PublicationPlace | Piscataway |
| PublicationTitle | IEEE transactions on emerging topics in computational intelligence |
| PublicationTitleAbbrev | TETCI |
| PublicationYear | 2018 |
| Publisher | IEEE; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| References | [1] Reed, "Scaling the Indian buffet process via submodular maximization," Proc. 30th Int. Conf. Mach. Learn., p. 1013.
[2] Doshi-Velez, "Correlated non-parametric latent feature models," Proc. 25th Conf. Uncertainty Artif. Intell., p. 143.
[3] Lee, "Algorithms for non-negative matrix factorization," Proc. Adv. Neural Inf. Process. Syst., p. 556.
[4] Lee, "Learning the parts of objects by non-negative matrix factorization," Nature, vol. 401, p. 788, 1999, doi:10.1038/44565.
[5] Li, "Nonnegative matrix factorizations for clustering: A survey," in Data Clustering: Algorithms and Applications, p. 149.
[6] Huang, Machine Learning: Modeling Data Locally and Globally, 2008, doi:10.1007/978-3-540-79452-3.
[7] doi:10.1016/j.ipm.2009.12.007.
[8] doi:10.1109/TETCI.2017.2743219.
[9] Luo, "An efficient non-negative matrix-factorization-based approach to collaborative filtering for recommender systems," IEEE Trans. Ind. Informat., vol. 10, p. 1273.
[10] Ding, "Orthogonal nonnegative matrix tri-factorizations for clustering," Proc. Int. Conf. Knowledge Discovery and Data Mining, p. 126.
[11] doi:10.1007/s10618-009-0145-2.
[12] Wang, "Fast nonnegative matrix tri-factorization for large-scale data co-clustering," Proc. 22nd Int. Joint Conf. Artif. Intell., p. 1553.
[13] doi:10.1016/j.eswa.2017.01.019.
[14] doi:10.1109/TIP.2014.2317981.
[15] doi:10.1109/CVPRW.2016.144.
[16] Doshi-Velez, "Variational inference for the Indian buffet process," Proc. Int. Conf. Artif. Intell. Statist., p. 137.
[17] doi:10.1162/neco.2007.19.7.1897.
[18] doi:10.1007/978-3-319-46687-3_65.
[19] Miller, "The phylogenetic Indian buffet process: A non-exchangeable nonparametric prior for latent features," Proc. 24th Conf. Uncertainty Artif. Intell., p. 403.
[20] doi:10.1007/978-3-540-74494-8_48.
[21] Griffiths, "Infinite latent feature models and the Indian buffet process," Proc. Adv. Neural Inf. Process. Syst., vol. 18, p. 475.
[22] Krause, "Identifying protein complexes in high-throughput protein interaction screens using an infinite latent feature model," Proc. Pacific Symp. Biocomputing, vol. 11, p. 231.
[23] doi:10.1162/neco.2008.04-07-504.
[24] Görür, "A choice model with infinitely many latent features," Proc. 23rd Int. Conf. Mach. Learn., p. 361.
[25] Miller, "Nonparametric latent feature models for link prediction," Proc. Adv. Neural Inf. Process. Syst., vol. 22, p. 1276.
[26] doi:10.1016/j.ijar.2016.12.010.
[27] Pan, "Robust non-negative dictionary learning," Proc. 28th Conf. Artif. Intell., p. 2027.
[28] Ghahramani, "Variational inference for Bayesian mixtures of factor analysers," Proc. Adv. Neural Inf. Process. Syst., vol. 12, p. 449.
[29] Attias, "A variational Bayesian framework for graphical models," Proc. Adv. Neural Inf. Process. Syst., vol. 12, p. 209.
[30] Ghahramani, "Propagation algorithms for variational Bayesian learning," Proc. Adv. Neural Inf. Process. Syst., vol. 13, p. 507.
[31] doi:10.1109/TPAMI.2014.2321387.
[32] Griffiths, "The Indian buffet process: An introduction and review," J. Mach. Learn. Res., vol. 12, p. 1185, 2011.
[33] Rai, "Multi-label prediction via sparse infinite CCA," Proc. Adv. Neural Inf. Process. Syst., p. 1518.
[34] Williamson, "Dependent Indian buffet processes," Proc. Int. Conf. Artif. Intell. Statist., p. 924.
[35] Nene, "Columbia Object Image Library (COIL-20)," 1996.
[36] doi:10.1137/060649021.
[37] Kulis, "Low-rank kernel learning with Bregman matrix divergences," J. Mach. Learn. Res., vol. 10, p. 341, 2009.
[38] doi:10.1007/s10994-013-5379-y.
[39] doi:10.1007/s00521-012-0956-8.
[40] doi:10.1145/1150402.1150420.
[41] doi:10.1089/cmb.2012.0273.
[42] doi:10.1016/j.neucom.2016.07.004.
[43] doi:10.1109/TPAMI.2013.223.
[44] doi:10.1109/CVPR.2017.217. |
| StartPage | 450 |
| SubjectTerms | Algorithms; Artificial intelligence; Bayes methods; Bayesian analysis; Benchmarks; Clustering; Clustering methods; Complexity theory; Computational intelligence; Computational modeling; Computer simulation; Feature extraction; Gaussian distribution; Indian Buffet Process prior; Infinite latent feature model; Infinite non-negative binary matrix tri-factorization; Iterative methods; Learning; Matrix decomposition; Normal distribution; Optimization; Pattern recognition; Simulation |
| Title | Learning Latent Features With Infinite Nonnegative Binary Matrix Trifactorization |
| URI | https://ieeexplore.ieee.org/document/8320965 https://www.proquest.com/docview/2296108903 |
| Volume | 2 |