Formal Definition
A Markov chain is a sequence of random variables X_1, X_2, X_3, ... with the Markov property: given the present state, the future and past states are independent. Formally,

Pr(X_{n+1} = x | X_1 = x_1, X_2 = x_2, ..., X_n = x_n) = Pr(X_{n+1} = x | X_n = x_n),

provided both conditional probabilities are well defined, that is, provided Pr(X_1 = x_1, ..., X_n = x_n) > 0.
The possible values of X_i form a countable set S called the state space of the chain.
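When the state space S is finite, the one-step dynamics can be collected into a transition matrix P with entries p_{ij} = Pr(X_{n+1} = j | X_n = i), where each row sums to 1. As an illustration (the two-state "weather" chain and its probabilities are assumptions for the example, not part of the original text), a chain on S = {sunny, rainy} might have

\[
P = \begin{pmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{pmatrix}.
\]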
Markov chains are often described by a directed graph in which the edges are labeled with the probabilities of moving from one state to the others.
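To make the definition concrete, here is a minimal sketch in Python that simulates such a chain. The two-state weather chain, its transition probabilities, and the function names are hypothetical choices for illustration; the graph's labeled edges appear as a dictionary mapping each state to its outgoing probabilities.

```python
import random

# Hypothetical two-state chain on S = {"sunny", "rainy"}. Each state maps
# to the probabilities of the edges leaving it; each row of probabilities
# sums to 1, matching the transition matrix P above.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def simulate(chain, start, steps, seed=None):
    """Sample a trajectory X_1, X_2, ..., X_{steps+1}.

    The next state is drawn using only the current state, which is
    exactly the Markov property: the past beyond X_n is never consulted.
    """
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        successors = list(chain[state])
        weights = list(chain[state].values())
        state = rng.choices(successors, weights=weights)[0]
        path.append(state)
    return path

print(simulate(transitions, "sunny", steps=10, seed=42))
```

Because the loop reads only `chain[state]`, extending the sketch to any finite state space is just a matter of adding rows to the dictionary.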