The Layered Hidden Markov Model

A layered hidden Markov model (LHMM) consists of N levels of HMMs, where the HMMs on level i + 1 correspond to observation symbols or probability generators at level i. Every level i of the LHMM consists of K_i HMMs running in parallel.
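As a concrete illustration of this structure, the sketch below lays out an LHMM as a list of levels, each holding the K_i HMMs that run in parallel at that level. This is a minimal hypothetical Python layout; the class names and parameter names (startprob, transmat, emissionprob) are illustrative assumptions, not notation from the original text.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class HMM:
    """A discrete-observation HMM, reduced to the parameters the
    layering needs (attribute names are illustrative)."""
    startprob: np.ndarray     # shape (n_states,)
    transmat: np.ndarray      # shape (n_states, n_states)
    emissionprob: np.ndarray  # shape (n_states, n_symbols)

@dataclass
class LHMM:
    """levels[i] holds the K_i HMMs running in parallel at level i
    (0-indexed here, so levels[0] is the lowest layer)."""
    levels: list

    def n_levels(self):
        return len(self.levels)

    def k(self, i):
        """Number of parallel HMMs, and hence output classes, at level i."""
        return len(self.levels[i])
```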

At any given level i in the LHMM, a sequence of observation symbols can be used to classify the input into one of K_i classes, where each class corresponds to one of the K_i HMMs at level i. This classification can then be used to generate a new observation for the level i + 1 HMMs. At the lowest layer, i.e. level 1, primitive observation symbols would be generated directly from observations of the modeled process. For example, in a trajectory-tracking task the primitive observation symbols would originate from the quantized sensor values. Thus, at each layer of the LHMM the observations originate from the classification of the underlying layer, except for the lowest layer, where the observation symbols originate from measurements of the observed process.
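Continuing the sketch above, one plausible reading of this bottom-up pass scores a window of symbols under every HMM at a level and passes the index of the best-scoring HMM upward as a new symbol. The forward-algorithm scoring and the fixed window length are assumptions for illustration; in particular, the K_i classes emitted by level i must match the symbol alphabet of the level i + 1 emission matrices.

```python
import numpy as np

def forward_loglik(hmm, obs):
    """Log-likelihood of a discrete symbol sequence (a list of ints)
    under one HMM, via the standard scaled forward algorithm."""
    alpha = hmm.startprob * hmm.emissionprob[:, obs[0]]
    logp = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = (alpha @ hmm.transmat) * hmm.emissionprob[:, obs[t]]
        scale = alpha.sum()
        logp += np.log(scale)
        alpha = alpha / scale
    return logp

def classify(level_hmms, window):
    """Winner-takes-all: the index of the HMM at this level that best
    explains the window becomes one observation symbol for the level above."""
    return int(np.argmax([forward_loglik(h, window) for h in level_hmms]))

def bottom_up_pass(lhmm, primitive_symbols, window=5):
    """Feed quantized sensor symbols into the lowest level and let each
    level's classifications serve as the next level's observations."""
    obs = list(primitive_symbols)
    for level_hmms in lhmm.levels:
        obs = [classify(level_hmms, obs[t:t + window])
               for t in range(0, len(obs) - window + 1, window)]
    return obs  # class labels emitted by the top level
```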

It is not necessary to run all levels at the same time granularity. For example, it is possible to use windowing at any level in the structure so that the classification takes the average of several classifications into consideration before passing the results up the layers of the LHMM.
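A minimal sketch of such windowing, under the assumption that each lower-level classification is available as a probability vector over the K_i classes, is to average consecutive vectors before handing them upward:

```python
import numpy as np

def windowed_average(class_distributions, window):
    """Average consecutive per-class probability vectors so that each
    result passed upward summarizes `window` lower-level classifications.
    `class_distributions`: array of shape (T, K_i); a ragged tail, if
    any, is dropped."""
    d = np.asarray(class_distributions, dtype=float)
    T = (len(d) // window) * window
    return d[:T].reshape(-1, window, d.shape[1]).mean(axis=1)
```

For instance (hypothetical numbers), lower-level classifications produced at 100 Hz could be averaged in windows of ten so the next level runs at 10 Hz.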

Instead of simply using the winning HMM at level i as an input symbol for the HMMs at level i + 1, it is possible to use it as a probability generator by passing the complete probability distribution up the layers of the LHMM. Thus, instead of a "winner takes all" strategy where the most probable HMM is selected as an observation symbol, the likelihood of observing the j-th HMM can be used in the recursion formula of the level i + 1 HMMs to account for the uncertainty in the classification of the HMMs at level i. Thus, if the classification of the HMMs at level i is uncertain, it is possible to pay more attention to the a priori information encoded in the HMMs at level i + 1.
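One common way to realize this, sketched below as an assumption rather than the original authors' exact formulation, is to replace the emission term in the forward recursion with its expectation under the lower-level class distribution. A near-uniform (uncertain) distribution then contributes almost no evidence, so the start and transition probabilities of the upper HMM, its a priori information, dominate the recursion.

```python
import numpy as np

def soft_forward_loglik(hmm, lower_posteriors):
    """Forward recursion in which each time step carries a full
    probability distribution over the K lower-level HMMs rather than a
    single winning symbol. The emission term becomes its expectation
    under that distribution: hmm.emissionprob @ p_t."""
    p = np.asarray(lower_posteriors, dtype=float)  # shape (T, K)
    alpha = hmm.startprob * (hmm.emissionprob @ p[0])
    logp = 0.0
    for t in range(len(p)):
        if t > 0:
            alpha = (alpha @ hmm.transmat) * (hmm.emissionprob @ p[t])
        scale = alpha.sum()
        logp += np.log(scale)
        alpha = alpha / scale
    return logp
```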

An LHMM could in practice be transformed into a single-layered HMM in which all the different models are concatenated together. Some of the advantages that may be expected from using the LHMM over a large single-layer HMM are that the LHMM is less likely to suffer from overfitting, since the individual sub-components are trained independently on smaller amounts of data. A consequence of this is that significantly less training data is required for the LHMM to achieve performance comparable to that of the single-layer HMM. Another advantage is that the layers at the bottom of the LHMM, which are more sensitive to changes in the environment such as the type of sensors, sampling rate, etc., can be retrained separately without altering the higher layers of the LHMM.
