Expectation–Maximization algorithm for finite mixture of α-stable distributions


Published in: Neurocomputing (Amsterdam), Volume 413, pp. 210–216
Main authors: Castillo-Barnes, D., Martinez-Murcia, F.J., Ramírez, J., Górriz, J.M., Salas-Gonzalez, D.
Format: Journal Article
Language: English
Publisher: Elsevier B.V., 06.11.2020
ISSN: 0925-2312, 1872-8286
Description
Summary: A Gaussian Mixture Model (GMM) is a parametric probability density function built as a weighted sum of Gaussian distributions. Gaussian mixtures are nowadays used for modelling probability distributions in many fields of research. Nevertheless, in many real applications the components are skewed or heavy tailed. For that reason, it is useful to model the mixture components with α-stable distributions. In this work, we present a mixture of skewed α-stable distributions whose parameters are estimated using the Expectation–Maximization algorithm. As the Gaussian distribution is a particular limiting case of the α-stable distribution, the proposed model is a generalization of the widely used GMM. The proposed algorithm is much faster than parameter estimation of the α-stable mixture model using a Bayesian approach with Markov chain Monte Carlo methods, and is therefore better suited to large sets of observations.
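For intuition, the EM iteration described in the abstract can be sketched in its Gaussian limiting case (α = 2), where both steps have closed forms; the paper's method replaces the Gaussian density below with a numerically evaluated α-stable density. This is a minimal NumPy sketch on hypothetical synthetic data, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: two Gaussian components (the alpha = 2 limiting case)
x = np.concatenate([rng.normal(-4, 1, 400), rng.normal(3, 1, 600)])

# Arbitrary initial guesses for weights, means, and standard deviations
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sig = np.array([1.0, 1.0])

def gauss_pdf(x, mu, sig):
    # Component densities, shape (n_points, n_components) via broadcasting
    return np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    r = w * gauss_pdf(x, mu, sig)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted updates of weights, means, spreads
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
```

After convergence the estimated means should recover the component locations (about −4 and 3) and the weights the mixing proportions (about 0.4 and 0.6). For α-stable components the M-step has no closed form, which is where the paper's estimation procedure comes in.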
DOI: 10.1016/j.neucom.2020.06.114