Markov Chain

A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another within a finite or countable set of possible states. It is a random process usually characterized as memoryless: the next state depends only on the current state, not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
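The memoryless behavior described above can be sketched in a few lines of Python. This is a minimal illustrative simulation, not taken from the text: the two states and their transition probabilities are assumptions chosen only to show that each step consults the current state alone.

```python
import random

# Hypothetical two-state chain (the states and probabilities are
# illustrative assumptions). Each state maps to a list of
# (next_state, probability) pairs.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Generate a trajectory of n transitions starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(walk("sunny", 5, seed=1))
```

Note that `step` receives no history, only the current state: that restriction is exactly the Markov property the paragraph describes.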
