Stochastic Matrix - Definition and Properties

Definition and Properties

A stochastic matrix describes a Markov chain over a finite state space S.

If the probability of moving from state i to state j in one time step is p_{i,j} = Pr(j | i), the stochastic matrix P is given by using p_{i,j} as the i-th row and j-th column element, e.g.,

P=\left(\begin{matrix}p_{1,1}&p_{1,2}&\dots&p_{1,j}&\dots\\
p_{2,1}&p_{2,2}&\dots&p_{2,j}&\dots\\
\vdots&\vdots&\ddots&\vdots&\ddots\\
p_{i,1}&p_{i,2}&\dots&p_{i,j}&\dots\\
\vdots&\vdots&\ddots&\vdots&\ddots
\end{matrix}\right).

Since the probability of transitioning from state i to some state must be 1, this matrix is a right stochastic matrix, so that

\sum_{j}p_{i,j}=1.
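
As a concrete numerical illustration (a minimal NumPy sketch; the 3-state chain and its probabilities are hypothetical, not taken from the text above), the row-sum condition can be checked directly:

import numpy as np

# Hypothetical 3-state right stochastic matrix: entry P[i, j] is the
# probability of moving from state i to state j in one time step.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Every row of a right stochastic matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)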

The probability of transitioning from i to j in two steps is then given by the (i,j)-th element of the square of P:

\left(P^2\right)_{i,j}=\sum_{m}p_{i,m}p_{m,j}.

In general, the probability of going from any state i to another state j in k steps in a finite Markov chain given by the matrix P is \left(P^k\right)_{i,j}.
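
Continuing the same hypothetical 3-state example (again a sketch, assuming NumPy), the two-step and k-step transition probabilities are just matrix powers of P:

import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])   # same hypothetical chain as above

# Two-step transition probabilities: (P^2)[i, j] = sum over m of P[i, m] * P[m, j].
P2 = P @ P
print(P2[0, 2])                   # probability of going from state 0 to state 2 in two steps

# k-step transition probabilities are the entries of the k-th matrix power of P.
k = 10
Pk = np.linalg.matrix_power(P, k)
print(Pk[0, 2])                   # probability of going from state 0 to state 2 in k steps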

An initial distribution is given as a row vector.

A stationary probability vector π is defined as a row vector that does not change under application of the transition matrix; that is, it is defined as a left eigenvector of the probability matrix, associated with eigenvalue 1:

\pi P=\pi.
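
For the hypothetical matrix used in the sketches above, a stationary vector can be computed as a left eigenvector for eigenvalue 1 (equivalently, a right eigenvector of the transpose); this is one possible approach, not the only one:

import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])   # same hypothetical chain as above

# Left eigenvectors of P are right eigenvectors of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
# Select the eigenvector whose eigenvalue is (numerically) 1.
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                # normalise so the entries sum to 1

# Stationarity check: pi P = pi.
assert np.allclose(pi @ P, pi)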

The Perron–Frobenius theorem ensures that every stochastic matrix has such a vector, and that the largest absolute value of an eigenvalue is always 1. In general, there may be several such vectors. However, for a matrix with strictly positive entries, this vector is unique and can be computed by observing that for any i we have the following limit,

\lim_{k\rightarrow\infty}\left(P^k\right)_{i,j}=\pi_j,

where π_j is the j-th element of the row vector π. This implies that the long-term probability of being in state j is independent of the initial state i. That either of these two computations gives one and the same stationary vector is a form of an ergodic theorem, which is generally true in a wide variety of dissipative dynamical systems: the system evolves, over time, to a stationary state. Intuitively, a stochastic matrix represents a Markov chain with no sink states; this implies that applying the stochastic matrix to a probability distribution redistributes the probability mass of the original distribution while preserving its total mass. If this process is applied repeatedly, the distribution converges to a stationary distribution for the Markov chain.
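
The limit above can be observed numerically. Since the hypothetical matrix from the earlier sketches has strictly positive entries, every row of P^k approaches the same stationary vector as k grows, and repeatedly applying P to any initial row vector gives the same limit (again a sketch, assuming NumPy):

import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])   # same hypothetical chain as above

# For large k the rows of P^k are (numerically) identical: each row is the
# stationary vector, so the long-run distribution does not depend on the start.
Pk = np.linalg.matrix_power(P, 50)
pi = Pk[0]
assert np.allclose(Pk, np.tile(pi, (3, 1)))
assert np.allclose(pi @ P, pi)    # pi is stationary

# Equivalently, repeatedly applying P to any initial distribution converges to pi.
x = np.array([1.0, 0.0, 0.0])     # start with all mass on state 0
for _ in range(50):
    x = x @ P
assert np.allclose(x, pi)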
