Multi-View Robust Feature Learning for Data Clustering

Published in: IEEE Signal Processing Letters, Volume 27, pp. 1750-1754
Main authors: Zhao, Liang; Zhao, Tianyang; Sun, Tingting; Liu, Zhuo; Chen, Zhikui
Format: Journal Article
Language: English
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), New York, 2020
ISSN: 1070-9908, 1558-2361
Description
Summary: Multi-view feature learning can provide basic information for consistent grouping and is common in practical applications such as judicial document clustering. However, it is challenging to combine multiple heterogeneous features to learn a comprehensive description of data samples. To solve this problem, many methods explore the correlations between features across views by assuming that all views share the same semantic information. Inspired by this, we propose a new multi-view robust feature learning (MRFL) method. In addition to projecting features from different views into a shared semantic subspace, our approach also learns the irrelevant information of the data space to capture the feature dependencies between views in the latent common subspace. MRFL can therefore capture flexible feature associations hidden in multi-view data. A new objective function is designed, and an effective optimization procedure for MRFL is derived. Experiments on real-world multi-view datasets show that the proposed MRFL method outperforms state-of-the-art multi-view learning methods.
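The core idea the abstract describes, projecting heterogeneous per-view features into a shared semantic subspace, can be illustrated with a minimal sketch. The code below is not the MRFL algorithm from the paper; it is a toy alternating-least-squares scheme (function name `shared_subspace`, regularizer `reg`, and all shapes are illustrative assumptions) that learns one projection matrix per view so that all projected views agree on a common low-dimensional representation.

```python
import numpy as np

def shared_subspace(views, k=2, n_iters=50, reg=1e-3, seed=0):
    """Toy shared-subspace learning (illustrative sketch, not MRFL).

    Each view X_v is an (n x d_v) feature matrix over the same n
    samples. We learn per-view projections W_v (d_v x k) and a common
    representation H (n x k) by alternating least squares, minimizing
    sum_v ||X_v W_v - H||^2 with a small ridge regularizer.
    """
    rng = np.random.default_rng(seed)
    n = views[0].shape[0]
    H = rng.standard_normal((n, k))          # shared representation
    for _ in range(n_iters):
        # Update each projection: W_v = (X_v^T X_v + reg I)^-1 X_v^T H
        Ws = [np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ H)
              for X in views]
        # Update the shared representation as the mean of the projections
        H = np.mean([X @ W for X, W in zip(views, Ws)], axis=0)
    return H, Ws
```

On synthetic views generated from a common latent factor, the projections of every view converge to (approximately) the same H, which is the consensus a downstream clustering step would operate on.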
DOI: 10.1109/LSP.2020.3026943