Continuous-time Markov Process - Mathematical Definitions


A Markov process, like a Markov chain, can be thought of as a directed graph of states of the system. The difference is that, rather than transitioning to a new (possibly the same) state at each time step, the system remains in the current state for some random (in particular, exponentially distributed) amount of time and then transitions to a different state. The process is characterized by transition rates q_ij between states i and j. Let X(t) be the random variable describing the state of the process at time t, and assume that the process is in state i at time t. Then q_ij (for i ≠ j) measures how quickly the transition from i to j happens. Precisely, after a tiny amount of time h, the probability the state is now j is given by

Pr(X(t + h) = j | X(t) = i) = q_ij h + o(h),

where o(h) represents a quantity that goes to zero faster than h as h goes to zero (see the article on order notation). Hence, over a sufficiently small interval of time, the probability of a particular transition (between different states) is roughly proportional to the duration of that interval. The q_ij are called transition rates because, if we have a large ensemble of n systems in state i, they will switch over to state j at an average rate of n q_ij until n decreases appreciably.
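
To make the small-h statement concrete, here is a minimal Python sketch (not part of the original text) for a hypothetical two-state system with a made-up rate q_01 = 2.0: among a large ensemble of systems started in state 0, the fraction that has jumped to state 1 after a tiny time h should be approximately q_01 h.

    import numpy as np

    # Hypothetical example: states 0 and 1, with a single transition rate q_01.
    # The waiting time for the 0 -> 1 jump is exponential with rate q_01, so
    # Pr(jump within h) = 1 - exp(-q_01 * h) = q_01 * h + o(h).
    rng = np.random.default_rng(0)
    q_01 = 2.0        # assumed transition rate from state 0 to state 1
    h = 1e-3          # a "tiny amount of time"
    n = 1_000_000     # ensemble of systems all starting in state 0

    jump_times = rng.exponential(scale=1.0 / q_01, size=n)
    print("empirical Pr(jump within h):", np.mean(jump_times <= h))
    print("q_01 * h approximation:     ", q_01 * h)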

The transition rates q_ij are typically given as the ij-th elements of the transition rate matrix Q (also known as an intensity matrix). As the transition rate matrix contains rates, the rate of departing from one state to arrive at another should be positive, and the rate at which the system remains in a state should be negative. The rates for a given state sum to zero, so the diagonal elements are

q_ii = − ∑_{j ≠ i} q_ij.
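
As an illustration (a sketch with made-up rates, not taken from the original text), a rate matrix for a three-state process can be assembled by filling in the off-diagonal rates q_ij and then setting each diagonal entry to the negative sum of the rest of its row:

    import numpy as np

    # Hypothetical off-diagonal rates q_ij for a three-state process; the
    # diagonal is filled in afterwards so that every row sums to zero.
    Q = np.array([
        [0.0, 0.5, 0.1],
        [0.2, 0.0, 0.4],
        [0.3, 0.3, 0.0],
    ])
    np.fill_diagonal(Q, -Q.sum(axis=1))

    print(Q)
    print("row sums:", Q.sum(axis=1))  # all (numerically) zero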
With this notation, and letting P_i(t) = Pr(X(t) = i) be the probability that the process is in state i at time t, collected into the row vector P(t) = (P_i(t)), the evolution of a continuous-time Markov process is given by the first-order differential equation

dP(t)/dt = P(t) Q.
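
Because Q here does not depend on time, this differential equation has the closed-form solution P(t) = P(0) e^{tQ}. The following sketch (reusing the hypothetical Q from above) evaluates it with SciPy's matrix exponential:

    import numpy as np
    from scipy.linalg import expm

    # Hypothetical three-state rate matrix whose rows sum to zero.
    Q = np.array([
        [-0.6,  0.5,  0.1],
        [ 0.2, -0.6,  0.4],
        [ 0.3,  0.3, -0.6],
    ])
    P0 = np.array([1.0, 0.0, 0.0])  # start in state 0 with probability 1

    # dP/dt = P(t) Q  has the solution  P(t) = P(0) expm(t Q).
    for t in (0.5, 1.0, 5.0):
        Pt = P0 @ expm(t * Q)
        print(f"t = {t:4.1f}  P(t) = {np.round(Pt, 4)}  sum = {Pt.sum():.4f}")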
The probability that no transition happens in some time r is

Pr(X(s) = i for all s in (t, t + r] | X(t) = i) = e^{q_ii r} = e^{−r ∑_{j ≠ i} q_ij}.

That is, the probability distribution of the waiting time until the first transition is an exponential distribution with rate parameter −q_ii (equivalently, ∑_{j ≠ i} q_ij), and continuous-time Markov processes are thus memoryless processes.
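
These exponential waiting times also give a direct way to simulate a trajectory (a Gillespie-style sketch under the same hypothetical Q; the function name simulate_ctmc is my own): while in state i, draw a holding time with rate −q_ii, then jump to state j with probability q_ij / (−q_ii).

    import numpy as np

    def simulate_ctmc(Q, state, t_end, seed=0):
        """Simulate one trajectory of a continuous-time Markov process.

        Q is a rate matrix whose rows sum to zero; `state` is the initial
        state.  Returns the list of (jump time, new state) pairs up to t_end.
        """
        rng = np.random.default_rng(seed)
        t, path = 0.0, [(0.0, state)]
        while True:
            exit_rate = -Q[state, state]            # holding-time rate -q_ii
            if exit_rate <= 0.0:                    # absorbing state
                break
            t += rng.exponential(1.0 / exit_rate)   # memoryless waiting time
            if t >= t_end:
                break
            probs = Q[state].clip(min=0.0) / exit_rate  # q_ij / (-q_ii)
            state = rng.choice(len(probs), p=probs)
            path.append((t, state))
        return path

    Q = np.array([[-0.6,  0.5,  0.1],
                  [ 0.2, -0.6,  0.4],
                  [ 0.3,  0.3, -0.6]])   # hypothetical rates, as above
    print(simulate_ctmc(Q, state=0, t_end=10.0))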

A time-dependent (time-heterogeneous) Markov process is a Markov process as above, but with the q-rates being functions of time, denoted q_ij(t).
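
In that case P(t) = P(0) e^{tQ} no longer applies, but the forward equation dP(t)/dt = P(t) Q(t) can still be integrated numerically. A minimal sketch with a hypothetical time-dependent rate q_01(t):

    import numpy as np
    from scipy.integrate import solve_ivp

    def Q_of_t(t):
        """Hypothetical time-dependent rate matrix Q(t) for two states."""
        q_01 = 1.0 + 0.5 * np.sin(t)   # made-up oscillating rate q_01(t)
        q_10 = 0.3                     # made-up constant rate q_10
        return np.array([[-q_01,  q_01],
                         [ q_10, -q_10]])

    # Integrate dP/dt = P(t) Q(t), starting in state 0.
    sol = solve_ivp(lambda t, P: P @ Q_of_t(t), t_span=(0.0, 10.0),
                    y0=[1.0, 0.0], dense_output=True)
    print(sol.sol(10.0))   # P(10); components still sum to 1 up to solver error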
