Embedded Markov Chain
One method of finding the stationary probability distribution, π, of an ergodic continuous-time Markov process with transition rate matrix Q is by first finding its embedded Markov chain (EMC). Strictly speaking, the EMC is a regular discrete-time Markov chain, sometimes referred to as a jump process. Each element of the one-step transition probability matrix of the EMC, S, is denoted by $s_{ij}$, and represents the conditional probability of transitioning from state i into state j. These conditional probabilities may be found by

$$s_{ij} = \begin{cases} \dfrac{q_{ij}}{\sum_{k \neq i} q_{ik}} & \text{if } i \neq j, \\ 0 & \text{otherwise.} \end{cases}$$
From this, S may be written as

$$S = I - D_Q^{-1} Q,$$

where I is the identity matrix and $D_Q = \operatorname{diag}(Q)$ is the diagonal matrix formed by selecting the main diagonal from Q and setting all other elements to zero.
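As a minimal sketch of this construction (assuming a small, hand-written 3-state rate matrix Q; numpy is the only dependency, and all names are illustrative):

```python
import numpy as np

# Hypothetical 3-state generator: rows sum to zero, off-diagonal
# entries are transition rates, q_ii = -(sum of the rest of row i).
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -2.0,  1.0],
              [ 2.0,  2.0, -4.0]])

D_Q = np.diag(np.diag(Q))                      # diag(Q): main diagonal, zeros elsewhere
S = np.eye(len(Q)) - np.linalg.inv(D_Q) @ Q    # S = I - D_Q^{-1} Q

# Each row of S sums to 1 and its diagonal is 0, as required for the EMC.
print(S)
```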
To find the stationary probability distribution vector, we must next find $\phi$ such that

$$\phi (S - I) = 0,$$

with $\phi$ being a row vector, such that all elements in $\phi$ are greater than 0 and $\|\phi\|_1 = 1$, and the 0 on the right side also being a row vector of 0's. From this, π may be found as

$$\pi = \frac{-\phi D_Q^{-1}}{\left\lVert \phi D_Q^{-1} \right\rVert_1}.$$
Note that S may be periodic, even if Q is not. Once π is found, it must be normalized so that its elements sum to 1; the division by the 1-norm in the formula above accomplishes this.
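Continuing the sketch above, $\phi$ can be taken as the left eigenvector of S for eigenvalue 1 (equivalently, a row vector in the null space of S − I), after which the formula above yields π. The generator Q is the same illustrative matrix as before:

```python
import numpy as np

# Same hypothetical generator as in the previous sketch.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -2.0,  1.0],
              [ 2.0,  2.0, -4.0]])
D_Q = np.diag(np.diag(Q))
S = np.eye(len(Q)) - np.linalg.inv(D_Q) @ Q

# phi: left eigenvector of S for eigenvalue 1, i.e. phi (S - I) = 0.
eigvals, eigvecs = np.linalg.eig(S.T)          # left eigenvectors of S = right eigenvectors of S^T
phi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
phi /= phi.sum()                               # positive entries summing to 1

# pi = -phi D_Q^{-1} / ||phi D_Q^{-1}||_1
unnormalized = -phi @ np.linalg.inv(D_Q)
pi = unnormalized / np.abs(unnormalized).sum()

print(pi)       # stationary distribution of the continuous-time process
print(pi @ Q)   # should be numerically the zero vector
```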
Another discrete-time process that may be derived from a continuous-time Markov chain is a δ-skeleton—the (discrete-time) Markov chain formed by observing X(t) at intervals of δ units of time. The random variables X(0), X(δ), X(2δ), ... give the sequence of states visited by the δ-skeleton.
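As a companion sketch (assuming scipy is available and reusing the illustrative generator Q from above), the one-step transition matrix of the δ-skeleton is the matrix exponential $e^{\delta Q}$:

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -2.0,  1.0],
              [ 2.0,  2.0, -4.0]])

delta = 0.1                    # illustrative observation interval
P_delta = expm(delta * Q)      # one-step transition matrix of the delta-skeleton

# X(0), X(delta), X(2*delta), ... form a discrete-time Markov chain
# whose one-step transition matrix is P_delta; each row sums to 1.
print(P_delta)
```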