A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another among a finite or countable number of possible states. It is a random process usually characterized as memoryless: the next state depends only on the current state, not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
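The Markov property described above can be illustrated with a minimal simulation. The sketch below assumes a hypothetical two-state "weather" chain with made-up transition probabilities (none of these names or values come from the text); the key point is that the `step` function looks only at the current state, never at the history.

```python
import random

# Hypothetical two-state chain: each state maps to (next_state, probability)
# pairs. The values are purely illustrative.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Pick the next state using ONLY the current state (the Markov property)."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps transitions and return the full path."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because `step` receives nothing but the current state, the simulated process is memoryless by construction: conditioning on any longer prefix of the path would not change the distribution of the next state.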