Conditional mixture modeling and model-based clustering

Published in: Pattern Recognition, Volume 133, Article 108994
Main authors: Melnykov, Volodymyr; Wang, Yang
Format: Journal Article
Language: English
Publication details: Elsevier Ltd, 01.01.2023
ISSN: 0031-3203, 1873-5142
Summary:
• Proposed a novel family of finite mixture models that:
  • can model non-compact clusters
  • can model each cluster individually without imposing common component structure assumptions
• Developed an algorithm for a speedy search of the conditioning order
• Illustrated the developed methodology on simulated and well-known classification data sets, with good results

Due to a potentially high number of parameters, finite mixture models are often at risk of overparameterization even for a moderate number of components. This can lead to overfitting individual components and result in underestimation of the mixture order. One of the most popular approaches to address this issue is to reduce the number of parameters by considering parsimonious models. The vast majority of techniques in this direction focus on the reparameterization of the covariance matrices associated with mixture components. We propose an alternative approach based on a parsimonious parameterization of the location parameters that enjoys remarkable modeling flexibility, especially in the presence of non-compact clusters. Thanks to an attractive closed-form formulation, speedy parameter estimation is available by means of the EM algorithm. The utility of the proposed method is illustrated on synthetic and well-known classification data sets.
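The abstract refers to parameter estimation for finite mixture models via the EM algorithm. The sketch below shows only the standard EM recursion for an ordinary Gaussian mixture (E-step responsibilities followed by closed-form M-step updates), as general background for that statement; it does not implement the article's conditional, parsimonious location parameterization, whose details are not given in this record, and the function and variable names are illustrative assumptions.

```python
# Minimal EM sketch for a plain Gaussian mixture model (illustrative only;
# NOT the conditional / parsimonious-location model proposed in the article).
import numpy as np

def em_gmm(X, K, n_iter=100, seed=0):
    """Fit a K-component Gaussian mixture to data X (n x d) with vanilla EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                      # mixing proportions
    mu = X[rng.choice(n, K, replace=False)]       # component means
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])

    for _ in range(n_iter):
        # E-step: component log-densities -> normalized responsibilities.
        log_r = np.empty((n, K))
        for k in range(K):
            diff = X - mu[k]
            inv = np.linalg.inv(Sigma[k])
            _, logdet = np.linalg.slogdet(Sigma[k])
            quad = np.einsum('ij,jk,ik->i', diff, inv, diff)
            log_r[:, k] = np.log(pi[k]) - 0.5 * (d * np.log(2 * np.pi) + logdet + quad)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: closed-form updates of proportions, means, and covariances.
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (r[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)

    return pi, mu, Sigma, r
```

Per the abstract, the proposed parsimony comes from constraining the location (mean) parameters rather than the covariance matrices; in a sketch like the one above, that would presumably replace the unconstrained update of mu with a restricted, conditionally specified one.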
DOI: 10.1016/j.patcog.2022.108994