
Steady-state Analysis and the Time-inhomogeneous Markov Chain

A Markov chain need not be time-homogeneous to have an equilibrium distribution. If there is a probability distribution over states π such that

\pi_j = \sum_{i \in S} \pi_i \Pr(X_{n+1} = j \mid X_n = i)

for every state j and every time n, then π is an equilibrium distribution of the Markov chain. This can occur in Markov chain Monte Carlo (MCMC) methods where several different transition matrices are used, because each is efficient for a particular kind of mixing, but every matrix respects the shared equilibrium distribution.
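As a minimal sketch of this idea (using a hypothetical two-state chain and NumPy; the matrices and schedule below are illustrative, not taken from any particular MCMC method), the code checks that two different transition matrices each preserve the same distribution π, so applying them in any time-varying order also leaves π invariant:

```python
import numpy as np

# Two different 2-state transition matrices that share the
# stationary distribution pi = [0.6, 0.4] (hypothetical example).
pi = np.array([0.6, 0.4])

P1 = np.array([[0.80, 0.20],
               [0.30, 0.70]])
P2 = np.array([[0.90, 0.10],
               [0.15, 0.85]])

# Each kernel individually preserves pi: pi @ P == pi.
assert np.allclose(pi @ P1, pi)
assert np.allclose(pi @ P2, pi)

# A time-inhomogeneous chain that applies the kernels in any order
# therefore also leaves pi invariant at every step n.
dist = np.array([1.0, 0.0])          # start far from equilibrium
schedule = [P1, P2, P2, P1, P2, P1]  # arbitrary time-varying schedule
for P in schedule:
    dist = dist @ P
print(dist)               # drifts toward [0.6, 0.4]
print(pi @ P1 @ P2 @ P1)  # pi is unchanged by any composition of the kernels
```

The design point mirrors the paragraph above: it is not the individual matrices that must be identical over time, only the distribution they each leave invariant.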

