Hidden Markov Model - Learning

Learning

The parameter-learning task in HMMs is to find, given an output sequence or a set of such sequences, the best set of state-transition and emission (output) probabilities. The task is usually to derive the maximum-likelihood estimate of the HMM's parameters given the set of output sequences. No tractable algorithm is known for solving this problem exactly, but a local maximum of the likelihood can be found efficiently using the Baum–Welch algorithm or the Baldi–Chauvin algorithm. The Baum–Welch algorithm is a special case of the expectation–maximization (EM) algorithm.
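As an illustration, here is a minimal NumPy sketch of the Baum–Welch procedure for a discrete-output HMM: a scaled forward–backward pass (E-step) followed by re-estimation of the initial, transition, and emission probabilities (M-step). The function name, arguments, and random initialization are illustrative choices, not taken from the original text, and the sketch omits numerical safeguards such as smoothing of zero counts.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=50, seed=0):
    """Sketch of Baum-Welch (EM) for a discrete-output HMM.

    obs: sequence of observation symbols (integers in [0, n_symbols)).
    Returns locally optimal estimates of the initial distribution pi,
    the transition matrix A, and the emission matrix B.
    """
    obs = np.asarray(obs)
    rng = np.random.default_rng(seed)
    T = len(obs)

    # Random row-stochastic initial guesses; EM converges to a local optimum.
    A = rng.random((n_states, n_states)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((n_states, n_symbols)); B /= B.sum(axis=1, keepdims=True)
    pi = np.full(n_states, 1.0 / n_states)

    for _ in range(n_iter):
        # E-step: scaled forward and backward passes.
        alpha = np.zeros((T, n_states))
        beta = np.zeros((T, n_states))
        scale = np.zeros(T)

        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum()
        alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            scale[t] = alpha[t].sum()
            alpha[t] /= scale[t]

        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

        # gamma[t, i]: posterior probability of being in state i at time t.
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)

        # xi[t, i, j]: posterior probability of transitioning i -> j at time t.
        xi = (alpha[:-1, :, None] * A[None, :, :]
              * (B[:, obs[1:]].T * beta[1:])[:, None, :])
        xi /= xi.sum(axis=(1, 2), keepdims=True)

        # M-step: re-estimate pi, A, B from the expected counts.
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros_like(B)
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]

    return pi, A, B
```

Because EM reaches only a local maximum of the likelihood, a common practice is to run the procedure from several random initializations (different `seed` values here) and keep the parameter set with the highest likelihood.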

