Steady-state Analysis and The Time-inhomogeneous Markov Chain

A Markov chain need not be time-homogeneous to have an equilibrium distribution. If there is a probability distribution π over states such that

    π_j = Σ_{i ∈ S} π_i Pr(X_{n+1} = j | X_n = i)

for every state j and every time n, then π is an equilibrium distribution of the Markov chain. This situation can arise in Markov chain Monte Carlo (MCMC) methods where several different transition matrices are used, each efficient for a particular kind of mixing, but all preserving a shared equilibrium distribution.
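The idea can be checked numerically. The sketch below (the specific matrices and the distribution π = (2/3, 1/3) are illustrative choices, not from the original text) builds two different 2-state transition matrices that share the same stationary distribution, then alternates them as a time-inhomogeneous schedule and confirms π is still the limit:

```python
import numpy as np

# For a 2-state matrix P = [[1-a, a], [b, 1-b]] the stationary
# distribution is (b, a) / (a + b).  Any pair of matrices with
# b = 2a therefore shares pi = (2/3, 1/3); these two are an
# illustrative choice.
P1 = np.array([[0.9, 0.1],
               [0.2, 0.8]])
P2 = np.array([[0.7, 0.3],
               [0.6, 0.4]])

pi = np.array([2/3, 1/3])

# pi is invariant under each matrix separately ...
print(np.allclose(pi @ P1, pi))  # True
print(np.allclose(pi @ P2, pi))  # True

# ... and hence under a time-inhomogeneous schedule that
# alternates them: P1, P2, P1, P2, ...
dist = np.array([1.0, 0.0])      # start far from equilibrium
for n in range(50):
    dist = dist @ (P1 if n % 2 == 0 else P2)
print(np.allclose(dist, pi))     # True
```

This mirrors the MCMC practice mentioned above: because every matrix in the schedule leaves π invariant, the sampler can switch freely between them to improve mixing without disturbing the target distribution.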

