Simultaneous Clustering and Model Selection: Algorithm, Theory and Applications
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 40, No. 8, pp. 1964-1978 |
|---|---|
| Main Authors: | |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: IEEE, 01.08.2018 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subjects: | |
| ISSN: | 0162-8828, 1939-3539, 2160-9292 |
| Summary: | While clustering has been well studied in the past decade, model selection has drawn much less attention due to the difficulty of the problem. In this paper, we address both problems jointly by recovering an ideal affinity tensor from an imperfect input. By taking into account the relationships among the affinities induced by the cluster structures, we are able to significantly improve the affinity input, for example by repairing entries corrupted by gross outliers. More importantly, the recovered ideal affinity tensor directly indicates the number of clusters and their membership, thus solving model selection and clustering jointly. To enforce the global consistency in the affinities demanded by the cluster structure, we impose a number of constraints: among others, the tensor should be low rank and sparse, and it should obey what we call the rank-1 sum constraint. To solve this highly non-smooth and non-convex problem, we exploit its mathematical structure and express the original problem in an equivalent form amenable to numerical optimization and convergence analysis. To scale to large problem sizes, we also propose an alternative formulation, so that such problems can be solved efficiently via stochastic optimization in an online fashion. We evaluate our algorithm on different applications to demonstrate its superiority, and show that it can adapt to a large variety of settings. |
|---|---|
| DOI: | 10.1109/TPAMI.2017.2739147 |
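A minimal, hypothetical sketch of the idea described in the summary, not the authors' algorithm: it substitutes a generic robust-PCA split (low rank plus sparse, solved by ADMM) for the paper's constrained tensor recovery, then reads the number of clusters and their membership off the recovered affinity by binarizing it and taking connected components. The function names, parameters (`lam`, `mu`, `thresh`), and the synthetic data are assumptions made purely for illustration.

```python
# Illustrative sketch only: a generic low-rank + sparse affinity repair,
# NOT the paper's rank-1-sum constrained tensor formulation.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components


def svt(M, tau):
    """Singular value thresholding (proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt


def soft(M, tau):
    """Entrywise soft thresholding (proximal operator of the l1 norm)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)


def rpca(A, lam=None, mu=None, iters=300):
    """Split A ~ L + S by ADMM: L is the repaired low-rank affinity,
    S absorbs gross outliers. A stand-in for the affinity-recovery step."""
    n = max(A.shape)
    lam = 1.0 / np.sqrt(n) if lam is None else lam
    mu = 1.25 / np.linalg.norm(A, 2) if mu is None else mu
    L, S, Y = np.zeros_like(A), np.zeros_like(A), np.zeros_like(A)
    for _ in range(iters):
        L = svt(A - S + Y / mu, 1.0 / mu)      # low-rank update
        S = soft(A - L + Y / mu, lam / mu)     # sparse-error update
        Y = Y + mu * (A - L - S)               # dual (multiplier) update
    return L, S


def clusters_from_affinity(L, thresh=0.5):
    """Model selection + clustering read directly off the recovered affinity:
    binarize, then count connected components and their labels."""
    W = (L + L.T) / 2.0 > thresh               # symmetrize, then binarize
    k, labels = connected_components(csr_matrix(W), directed=False)
    return k, labels


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sizes = [5, 7, 8]                          # three ground-truth clusters
    n = sum(sizes)
    ideal = np.zeros((n, n))
    start = 0
    for m in sizes:                            # block-diagonal ideal affinity
        ideal[start:start + m, start:start + m] = 1.0
        start += m
    noisy = ideal + 0.05 * rng.standard_normal((n, n))
    rows = rng.integers(0, n, size=10)         # a few gross corruptions
    cols = rng.integers(0, n, size=10)
    noisy[rows, cols] += rng.choice([-3.0, 3.0], size=10)

    L, S = rpca(noisy)
    k, labels = clusters_from_affinity(L)
    print("estimated number of clusters:", k)
    print("memberships:", labels)
```

The connected-components step is what makes model selection automatic in this toy setting: the number of blocks in the binarized affinity is the estimated number of clusters, so no value of k is specified in advance.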