Viterbi Algorithm - Example

Example

Consider a primitive clinic in a village. The villagers have the convenient property that each of them is either healthy or has a fever, and only the doctor at the clinic can determine which. The wise doctor diagnoses fever by asking patients how they feel; villagers may only answer that they feel normal, dizzy, or cold.

Suppose a patient comes to the clinic each day and tells the doctor how he feels. The doctor believes that the health condition of this patient operates as a discrete Markov chain. There are two states, "Healthy" and "Fever", but the doctor cannot observe them directly; they are hidden from him. On each day, there is a certain chance that the patient will report one of the following feelings, depending on his health condition: "normal", "cold", or "dizzy". Those are the observations. The entire system is that of a hidden Markov model (HMM).

The doctor knows the villagers' general health condition, and knows on average what symptoms patients complain of with or without a fever. In other words, the parameters of the HMM are known. They can be represented as follows in the Python programming language:

states = ('Healthy', 'Fever')

observations = ('normal', 'cold', 'dizzy')

start_probability = {'Healthy': 0.6, 'Fever': 0.4}

transition_probability = {
    'Healthy': {'Healthy': 0.7, 'Fever': 0.3},
    'Fever':   {'Healthy': 0.4, 'Fever': 0.6},
}

emission_probability = {
    'Healthy': {'normal': 0.5, 'cold': 0.4, 'dizzy': 0.1},
    'Fever':   {'normal': 0.1, 'cold': 0.3, 'dizzy': 0.6},
}

In this piece of code, start_probability represents the doctor's belief about which state the HMM is in when the patient first visits (all he knows is that the patient tends to be healthy). The particular probability distribution used here is not the equilibrium one, which is (given the transition probabilities) approximately {'Healthy': 0.57, 'Fever': 0.43}. The transition_probability represents the change of health condition in the underlying Markov chain. In this example, there is only a 30% chance that tomorrow the patient will have a fever if he is healthy today. The emission_probability represents how the patient is likely to feel on each day: if he is healthy, there is a 50% chance that he feels normal; if he has a fever, there is a 60% chance that he feels dizzy.
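As a quick check of that equilibrium claim (a sketch added here for illustration, not part of the original example), the stationary distribution of a two-state chain can be computed directly from the transition probabilities:

# Sketch: equilibrium distribution of the two-state chain, i.e. the pi
# solving pi = pi * P. For P(Healthy -> Fever) = a and
# P(Fever -> Healthy) = b, this is (b / (a + b), a / (a + b)).
a = transition_probability['Healthy']['Fever']   # 0.3
b = transition_probability['Fever']['Healthy']   # 0.4
equilibrium = {'Healthy': b / (a + b), 'Fever': a / (a + b)}
print(equilibrium)   # approx. {'Healthy': 0.571, 'Fever': 0.429}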

The patient visits three days in a row, and the doctor discovers that on the first day he feels normal, on the second day he feels cold, and on the third day he feels dizzy. The doctor has a question: what is the most likely sequence of health conditions of the patient that would explain these observations? This is answered by the Viterbi algorithm.

# Helps visualize the steps of Viterbi.
def print_dptable(V):
    print("       " + " ".join(("%7d" % i) for i in range(len(V))))
    for y in V[0]:
        print("%.5s: " % y + " ".join(("%.7s" % ("%f" % v[y])) for v in V))

def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{}]
    path = {}

    # Initialize base cases (t == 0)
    for y in states:
        V[0][y] = start_p[y] * emit_p[y][obs[0]]
        path[y] = [y]

    # Run Viterbi for t > 0
    for t in range(1, len(obs)):
        V.append({})
        newpath = {}

        for y in states:
            (prob, state) = max((V[t - 1][y0] * trans_p[y0][y] * emit_p[y][obs[t]], y0)
                                for y0 in states)
            V[t][y] = prob
            newpath[y] = path[state] + [y]

        # Don't need to remember the old paths
        path = newpath

    print_dptable(V)
    (prob, state) = max((V[len(obs) - 1][y], y) for y in states)
    return (prob, path[state])

The function viterbi takes the following arguments: obs is the sequence of observations, e.g. ('normal', 'cold', 'dizzy'); states is the set of hidden states; start_p is the start probability; trans_p are the transition probabilities; and emit_p are the emission probabilities. For simplicity of code, we assume that the observation sequence obs is non-empty and that trans_p[i][j] and emit_p[i][j] are defined for all states i, j.

In the running example, the forward/Viterbi algorithm is used as follows:

def example():
    return viterbi(observations,
                   states,
                   start_probability,
                   transition_probability,
                   emission_probability)

print(example())
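With the definitions above, this prints the dynamic-programming table followed by the winning probability and path, approximately as follows (the values are computed by hand from the recurrence; exact spacing and the last digits of the floats may differ):

              0       1       2
Healt: 0.30000 0.08400 0.00588
Fever: 0.04000 0.02700 0.01512
(0.01512, ['Healthy', 'Healthy', 'Fever'])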

This reveals that the observations were most likely generated by the state sequence ['Healthy', 'Healthy', 'Fever']. In other words, given the observed symptoms, the patient was most likely to have been healthy both on the first day, when he felt normal, and on the second day, when he felt cold, and then to have contracted a fever on the third day.

The operation of Viterbi's algorithm can be visualized by means of a trellis diagram, with one node per hidden state per day and edges weighted by the transition and emission probabilities. The Viterbi path is essentially the shortest path through this trellis.

When implementing Viterbi's algorithm, it should be noted that many languages use floating-point arithmetic; because the path probability is a product of many factors smaller than one, it may underflow on long observation sequences. A common technique to avoid this is to take the logarithm of the probabilities and use sums of logarithms throughout the computation, the same technique used in the Logarithmic Number System. Once the algorithm has terminated, an accurate value can be obtained by performing the appropriate exponentiation.
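As an illustration of this technique (a sketch, assuming all probabilities involved are strictly positive so their logarithms exist; it is not code from the original article), the same recurrence can be carried out in log space:

from math import log

def viterbi_log(obs, states, start_p, trans_p, emit_p):
    # Same recurrence as viterbi above, but products of probabilities
    # become sums of log-probabilities, which do not underflow.
    V = [{y: log(start_p[y]) + log(emit_p[y][obs[0]]) for y in states}]
    path = {y: [y] for y in states}
    for t in range(1, len(obs)):
        V.append({})
        newpath = {}
        for y in states:
            (logprob, state) = max((V[t - 1][y0] + log(trans_p[y0][y])
                                    + log(emit_p[y][obs[t]]), y0)
                                   for y0 in states)
            V[t][y] = logprob
            newpath[y] = path[state] + [y]
        path = newpath
    (logprob, state) = max((V[len(obs) - 1][y], y) for y in states)
    # Exponentiate logprob at the very end to recover the probability.
    return (logprob, path[state])

On the running example this returns roughly (-4.19, ['Healthy', 'Healthy', 'Fever']), and exponentiating -4.19 recovers the probability 0.01512 found above.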
