Markov Chain - Reversible Markov Chain

Reversible Markov Chain

A Markov chain is said to be reversible if there is a probability distribution over states, π, such that

πi Pr(Xn+1 = j | Xn = i) = πj Pr(Xn+1 = i | Xn = j)

for all times n and all states i and j. This condition is also known as the detailed balance condition (some books refer to it as the local balance equation). With a time-homogeneous Markov chain, Pr(Xn+1 = j | Xn = i) does not change with the time n and can be written more simply as pij. In this case, the detailed balance equation can be written more compactly as

πi pij = πj pji.
Summing the original equation over i gives

Σi πi Pr(Xn+1 = j | Xn = i) = Σi πj Pr(Xn+1 = i | Xn = j) = πj Σi Pr(Xn+1 = i | Xn = j) = πj,

so, for reversible Markov chains, π is always a steady-state distribution of Pr(Xn+1 = j | Xn = i) for every n.
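
As a concrete check of these two facts, the following short Python sketch (an illustrative example, not part of the original text; the transition matrix is an assumed small birth-death chain, which is reversible) computes the steady-state distribution and verifies both the compact detailed balance equation πi pij = πj pji and the stationarity condition obtained by summing over i.

    import numpy as np

    # A small birth-death chain on states {0, 1, 2}; such chains are reversible.
    # (This particular transition matrix is an assumption made for the example.)
    P = np.array([
        [0.50, 0.50, 0.00],
        [0.25, 0.50, 0.25],
        [0.00, 0.50, 0.50],
    ])

    # Steady-state distribution: left eigenvector of P for eigenvalue 1, normalized.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()

    # Detailed balance: pi_i * p_ij == pi_j * p_ji, i.e. the flow matrix is symmetric.
    flows = pi[:, None] * P
    print("detailed balance:", np.allclose(flows, flows.T))

    # Stationarity (obtained by summing detailed balance over i): pi P == pi.
    print("steady state:    ", np.allclose(pi @ P, pi))

For this matrix the steady-state distribution works out to π = (0.25, 0.5, 0.25), and both checks print True.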

If the Markov chain begins in the steady-state distribution, i.e., if Pr(X0 = i) = πi, then Pr(Xn = i) = πi for all n and the detailed balance equation can be written as

Pr(Xn = i, Xn+1 = j) = Pr(Xn+1 = i, Xn = j).

The left- and right-hand sides of this last equation are identical except for a reversing of the time indices n and n + 1.
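
To see why the two sides agree, write them with the shorthand pij:

Pr(Xn = i, Xn+1 = j) = πi pij = πj pji = Pr(Xn+1 = i, Xn = j),

where the middle equality is exactly the detailed balance equation.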

Reversible Markov chains are common in Markov chain Monte Carlo (MCMC) approaches because constructing a chain that satisfies the detailed balance equation for a desired distribution π guarantees that π is a steady-state distribution of that chain. This holds even for time-inhomogeneous Markov chains, in which several transition matrices are used: if each transition matrix satisfies detailed balance with the desired distribution π, then π is a steady-state distribution of the resulting chain.
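
One standard example is the Metropolis algorithm. The sketch below (a minimal illustration, not from the original text; the discrete target distribution and the uniform proposal are assumptions chosen for the example) accepts a proposed move with probability min(1, πproposal/πcurrent), which is precisely what makes the resulting transition probabilities satisfy detailed balance with π.

    import numpy as np

    rng = np.random.default_rng(0)

    # Desired (target) distribution pi on a small discrete state space.
    # (The particular values are an assumption made for the example.)
    pi = np.array([0.1, 0.2, 0.3, 0.4])
    n_states = len(pi)

    def metropolis_step(x):
        """One Metropolis step with a symmetric, uniform proposal over all states."""
        proposal = rng.integers(n_states)
        # Accepting with probability min(1, pi[proposal] / pi[x]) enforces
        # detailed balance of the transition probabilities with respect to pi.
        if rng.random() < min(1.0, pi[proposal] / pi[x]):
            return proposal
        return x

    # Run the chain and compare empirical state frequencies with the target pi.
    x, counts = 0, np.zeros(n_states)
    for _ in range(200_000):
        x = metropolis_step(x)
        counts[x] += 1

    print("target:   ", pi)
    print("empirical:", np.round(counts / counts.sum(), 3))

Because the uniform proposal is symmetric, the probability flow πx · (1/n) · min(1, πy/πx) = (1/n) · min(πx, πy) is symmetric in x and y, which is the detailed balance condition; by the argument above, π is therefore a steady-state distribution, and the empirical frequencies approach π.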

