Markov Chains

The probability of going from state i to state j in n time steps is

p_{ij}^{(n)} = \Pr(X_n = j \mid X_0 = i),

and the single-step transition probability is

p_{ij} = \Pr(X_1 = j \mid X_0 = i).

For a time-homogeneous Markov chain these do not depend on the starting time, so for any k:

p_{ij}^{(n)} = \Pr(X_{k+n} = j \mid X_k = i)

and

p_{ij} = \Pr(X_{k+1} = j \mid X_k = i).
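
As a concrete sketch (the 3-state transition matrix below and the choice n = 5 are illustrative assumptions, not part of the original text), the n-step transition probabilities of a time-homogeneous chain are simply the entries of the n-th power of the one-step transition matrix:

```python
import numpy as np

# Hypothetical 3-state chain (the matrix values are illustrative only).
# P[i, j] = Pr(X_{k+1} = j | X_k = i); each row sums to 1.
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
])

# For a time-homogeneous chain, the n-step transition probabilities
# p_ij^(n) = Pr(X_n = j | X_0 = i) are the entries of the matrix power P^n.
n = 5
P_n = np.linalg.matrix_power(P, n)

print(P_n[0, 2])  # Pr(X_5 = 2 | X_0 = 0)
```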

The n-step transition probabilities satisfy the Chapman–Kolmogorov equation: for any k such that 0 < k < n,

p_{ij}^{(n)} = \sum_{r \in S} p_{ir}^{(k)} \, p_{rj}^{(n-k)},

where S is the state space of the Markov chain.
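
The Chapman–Kolmogorov equation can be checked numerically with the same illustrative matrix (P, n, and k below are assumed example values): summing over the intermediate state r is exactly the matrix product of the k-step and (n−k)-step transition matrices.

```python
import numpy as np

# Same illustrative transition matrix as above.
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
])

n, k = 5, 2  # any k with 0 < k < n

# Chapman–Kolmogorov: p_ij^(n) = sum over r in S of p_ir^(k) * p_rj^(n-k),
# i.e. the n-step matrix equals the product of the k-step and (n-k)-step matrices.
lhs = np.linalg.matrix_power(P, n)
rhs = np.linalg.matrix_power(P, k) @ np.linalg.matrix_power(P, n - k)

print(np.allclose(lhs, rhs))  # True
```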

The marginal distribution Pr(X_n = x) is the distribution over states at time n. The initial distribution is Pr(X_0 = x). The evolution of the process through one time step is described by

\Pr(X_{n+1} = x) = \sum_{r \in S} p_{rx} \Pr(X_n = r).
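
A minimal sketch of this one-step evolution, assuming the same illustrative matrix and a made-up initial distribution concentrated on state 0: the marginal distribution is a row vector that is multiplied on the right by the transition matrix at each step.

```python
import numpy as np

# Same illustrative transition matrix as above.
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
])

# Assumed initial distribution Pr(X_0 = x): start in state 0 with certainty.
pi = np.array([1.0, 0.0, 0.0])

# One time step of the evolution:
# Pr(X_{n+1} = x) = sum over r in S of p_rx * Pr(X_n = r),
# which is the row vector pi multiplied on the right by P.
for _ in range(5):
    pi = pi @ P

print(pi)        # marginal distribution Pr(X_5 = x)
print(pi.sum())  # still sums to 1
```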

Note: The superscript (n) is an index and not an exponent.
