Robust estimation of discrete hidden Markov model parameters using the entropy-based feature-parameter weighting and source-quantization modeling

Bibliographic Details
Published in: Artificial Intelligence in Engineering, Vol. 12, No. 3, pp. 243-252
Main Authors: Choi, Hwan Jin; Yun, Sung Jin; Oh, Yung Hwan
Format: Journal Article
Language: English
Published: Oxford: Elsevier Ltd, 01.07.1998
ISSN:0954-1810
Description
Summary: We propose a new variant of the discrete hidden Markov model (DHMM) in which the output distribution is estimated by state-dependent source-quantization modeling and the output probability is weighted by the entropy of each feature-parameter at a state. Each state-dependent source is represented as a state-dependent quantized vector, regarded as a variant of a representative vector at a state, together with its own codeword distribution, and the output distribution is derived from the state-dependent sources that exist at a state. In addition, entropy-based feature-parameter weighting is proposed to reflect the differing importance of each feature-parameter in a state, and a fuzzy function is applied to transform an entropy value into a feature-parameter weighting factor. In experiments, the proposed methods showed an improvement of 5.6%, indicating their effectiveness in the robust estimation of output probabilities for DHMMs.
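The entropy-based weighting idea in the summary can be sketched as follows: compute the Shannon entropy of each feature-parameter's codeword distribution at a state, map that entropy through a fuzzy membership function to a weight (low entropy, i.e. a discriminative feature, gets a weight near 1), and scale each feature's log output probability by its weight. This is a minimal sketch under assumptions: the sigmoid-shaped fuzzy function, its steepness, and the weighted-sum combination are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def feature_entropy(codeword_probs):
    """Shannon entropy (bits) of one feature-parameter's codeword
    distribution at a state."""
    p = np.asarray(codeword_probs, dtype=float)
    p = p[p > 0.0]                       # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

def fuzzy_weight(entropy, max_entropy, steepness=8.0):
    """Fuzzy mapping from entropy to a weight in (0, 1): low entropy
    (informative feature) -> weight near 1, high entropy (uninformative
    feature) -> weight near 0. The sigmoid shape and steepness are
    assumptions; the paper's fuzzy function may differ."""
    x = entropy / max_entropy            # normalize to [0, 1]
    return float(1.0 / (1.0 + np.exp(steepness * (x - 0.5))))

def weighted_log_output_prob(per_feature_probs, per_feature_entropies,
                             max_entropy):
    """Weighted log output probability at a state: each feature-parameter's
    log probability is scaled by its entropy-derived weight."""
    logp = 0.0
    for p, h in zip(per_feature_probs, per_feature_entropies):
        logp += fuzzy_weight(h, max_entropy) * np.log(p)
    return logp

# Example: two feature-parameters at a state, 4-codeword codebook each.
peaked  = [0.85, 0.05, 0.05, 0.05]       # low entropy -> informative
uniform = [0.25, 0.25, 0.25, 0.25]       # maximal entropy -> uninformative
h1, h2 = feature_entropy(peaked), feature_entropy(uniform)
max_h = np.log2(4)                       # 2 bits for a 4-codeword codebook
# The peaked (discriminative) feature receives the larger weight:
assert fuzzy_weight(h1, max_h) > fuzzy_weight(h2, max_h)
```

The effect is that uninformative feature-parameters contribute little to the state's output probability, which is the robustness mechanism the summary describes.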
DOI:10.1016/S0954-1810(97)00026-5