Bayesian computational methods for Hidden Markov Models

Samson L Ghebremariam, University of Texas at El Paso

Abstract

Hidden Markov Models (HMMs) have been applied to many real-world problems. Hidden Markov modeling has become increasingly important and popular among researchers, and many software tools are built on it. Because the models are rich in mathematical structure, they provide a theoretical foundation for a wide range of applications. Hidden Markov models offer a general framework for the statistical analysis of a large variety of DNA sequences composed of the symbols A, C, G, and T. In an HMM, the state of the model cannot be determined from the generated symbol alone. A Bayesian analysis using Markov chain Monte Carlo (MCMC) sampling techniques can be implemented to simulate the hidden Markov model parameters from their posterior distribution given the observed data. In this paper we adopt a Gibbs sampling algorithm to sample the hidden states and the model parameters from their posterior distribution. Unobservable latent variables serve as indicators of the hidden states. We propose finite mixtures of hidden Markov models with component weights that depend on time. In addition to the hidden state indicators, latent variables are used as mixture indicators. To obtain a better fit, we divide the data into non-overlapping segments of equal length such that all observations within the same segment belong to the same component. Through simulation studies and real data, the results are compared with standard maximum likelihood estimates and are shown to outperform them.
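To make the sampling scheme concrete, the sketch below illustrates one possible Gibbs sampler for a discrete-output HMM: it alternates between drawing the hidden state path by forward filtering and backward sampling, and drawing the transition and emission probabilities from their Dirichlet conditional posteriors. This is a minimal illustration, not the thesis's exact algorithm; the function names (ffbs, gibbs_hmm), the Dirichlet hyperparameters a_trans and a_emit, the fixed uniform initial distribution, and the toy DNA-coded sequence are assumptions introduced here.

    # Minimal sketch of Gibbs sampling for a discrete-output HMM.
    # All priors, sizes, and the example data are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def ffbs(obs, pi, A, B):
        """Sample a hidden-state path given parameters (forward filter, backward sample)."""
        T, K = len(obs), len(pi)
        alpha = np.zeros((T, K))
        alpha[0] = pi * B[:, obs[0]]
        alpha[0] /= alpha[0].sum()
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            alpha[t] /= alpha[t].sum()
        states = np.zeros(T, dtype=int)
        states[-1] = rng.choice(K, p=alpha[-1])
        for t in range(T - 2, -1, -1):
            p = alpha[t] * A[:, states[t + 1]]
            states[t] = rng.choice(K, p=p / p.sum())
        return states

    def gibbs_hmm(obs, K, M, n_iter=500, a_trans=1.0, a_emit=1.0):
        """Alternate between sampling the state path and the model parameters."""
        T = len(obs)
        A = rng.dirichlet(np.ones(K), size=K)   # transition matrix (K hidden states)
        B = rng.dirichlet(np.ones(M), size=K)   # emission matrix (M observable symbols)
        pi = np.ones(K) / K                     # fixed uniform initial distribution (assumption)
        samples = []
        for _ in range(n_iter):
            z = ffbs(obs, pi, A, B)             # step 1: sample hidden states given parameters
            trans_counts = np.zeros((K, K))
            emit_counts = np.zeros((K, M))
            for t in range(T - 1):
                trans_counts[z[t], z[t + 1]] += 1
            for t in range(T):
                emit_counts[z[t], obs[t]] += 1
            for k in range(K):                  # step 2: sample parameters given states
                A[k] = rng.dirichlet(a_trans + trans_counts[k])
                B[k] = rng.dirichlet(a_emit + emit_counts[k])
            samples.append((A.copy(), B.copy(), z.copy()))
        return samples

    # Example: a short DNA-like sequence coded as A=0, C=1, G=2, T=3.
    obs = np.array([0, 1, 2, 3, 2, 2, 0, 1, 3, 3, 2, 0])
    draws = gibbs_hmm(obs, K=2, M=4, n_iter=200)

Posterior summaries of the transition and emission probabilities can then be formed from the retained draws after discarding an initial burn-in portion of the chain.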

Subject Area

Statistics

Recommended Citation

Ghebremariam, Samson L, "Bayesian computational methods for Hidden Markov Models" (2011). ETD Collection for University of Texas, El Paso. AAI1498288.
https://scholarworks.utep.edu/dissertations/AAI1498288
