@cheekybastard
Created July 16, 2014 03:03
Hidden Markov model example
# http://sujitpal.blogspot.com.au/2013/03/the-wikipedia-bob-alice-hmm-example.html
# http://en.wikipedia.org/wiki/Hidden_Markov_model
import numpy as np
from hmmlearn import hmm  # sklearn.hmm was removed; HMM support now lives in hmmlearn

states = ["Rainy", "Sunny"]
n_states = len(states)

observations = ["walk", "shop", "clean"]
n_observations = len(observations)

start_probability = np.array([0.6, 0.4])
transition_probability = np.array([
    [0.7, 0.3],
    [0.4, 0.6],
])
emission_probability = np.array([
    [0.1, 0.4, 0.5],
    [0.6, 0.3, 0.1],
])

# CategoricalHMM models discrete observation symbols; set the parameters
# via the public trailing-underscore attributes rather than private setters.
model = hmm.CategoricalHMM(n_components=n_states)
model.startprob_ = start_probability
model.transmat_ = transition_probability
model.emissionprob_ = emission_probability

# Predict the most likely sequence of hidden states for the visible sequence.
bob_says = np.array([[0, 2, 1, 1, 2, 0]]).T  # decode expects a column of samples
logprob, alice_hears = model.decode(bob_says, algorithm="viterbi")
print("Bob says:", ", ".join(observations[x] for x in bob_says.ravel()))
print("Alice hears:", ", ".join(states[x] for x in alice_hears))
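To sanity-check the decoded path without any HMM library, here is a minimal pure-NumPy Viterbi sketch for the same Bob/Alice model. The `viterbi` helper is illustrative (not part of any library); it runs the standard log-domain dynamic program and backtracks through the stored argmax pointers.

```python
import numpy as np

def viterbi(obs, startprob, transmat, emissionprob):
    """Most likely hidden-state path for a discrete observation sequence."""
    T = len(obs)
    n_states = startprob.shape[0]
    # log-probability of the best path ending in each state at time 0
    logdelta = np.log(startprob) + np.log(emissionprob[:, obs[0]])
    back = np.zeros((T, n_states), dtype=int)  # backpointers
    for t in range(1, T):
        # scores[i, j]: best path ending in i at t-1, then transition i -> j
        scores = logdelta[:, None] + np.log(transmat)
        back[t] = scores.argmax(axis=0)
        logdelta = scores.max(axis=0) + np.log(emissionprob[:, obs[t]])
    # backtrack from the best final state
    path = [int(logdelta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1], float(logdelta.max())

states = ["Rainy", "Sunny"]
startprob = np.array([0.6, 0.4])
transmat = np.array([[0.7, 0.3], [0.4, 0.6]])
emissionprob = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])

path, logp = viterbi([0, 2, 1, 1, 2, 0], startprob, transmat, emissionprob)
print(", ".join(states[s] for s in path))
# -> Sunny, Rainy, Rainy, Rainy, Rainy, Sunny
```

Working the recursion by hand gives the same answer as `model.decode`: the lone "walk" days are most likely Sunny, while the clean/shop stretch in the middle is most likely Rainy.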