Markov Decision Process - Definition

Definition

A Markov decision process is a 4-tuple (S, A, P_a, R_a), where

  • S is a finite set of states,
  • A is a finite set of actions (alternatively, A_s is the finite set of actions available from state s),
  • P_a(s, s') = Pr(s_{t+1} = s' | s_t = s, a_t = a) is the probability that action a in state s at time t will lead to state s' at time t+1,
  • R_a(s, s') is the immediate reward (or expected immediate reward) received after transitioning from state s to state s', due to action a.

(The theory of Markov decision processes does not actually require S or A to be finite, but the basic algorithms below assume that they are finite.)
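The 4-tuple above maps naturally onto plain data structures. The following is a minimal sketch, assuming a toy two-state, two-action MDP with illustrative names (the states, actions, and numbers are invented for the example, not taken from the text): P[a][s] holds the distribution P_a(s, ·) over next states, and R[a][s][s'] holds the reward R_a(s, s').

```python
# Toy MDP illustrating the 4-tuple (S, A, P_a, R_a).
# All names and numbers here are illustrative assumptions.

S = ["s0", "s1"]          # finite set of states
A = ["stay", "go"]        # finite set of actions

# P[a][s][s'] = Pr(s_{t+1} = s' | s_t = s, a_t = a)
P = {
    "stay": {"s0": {"s0": 0.9, "s1": 0.1},
             "s1": {"s0": 0.0, "s1": 1.0}},
    "go":   {"s0": {"s0": 0.2, "s1": 0.8},
             "s1": {"s0": 0.7, "s1": 0.3}},
}

# R[a][s][s'] = immediate reward for the transition s -> s' under action a
R = {
    "stay": {"s0": {"s0": 0.0, "s1": 0.0},
             "s1": {"s0": 0.0, "s1": 2.0}},
    "go":   {"s0": {"s0": 0.0, "s1": 5.0},
             "s1": {"s0": 1.0, "s1": 0.0}},
}

# Sanity check: each P_a(s, .) must be a probability distribution,
# i.e. its entries sum to 1 for every action a and state s.
for a in A:
    for s in S:
        assert abs(sum(P[a][s].values()) - 1.0) < 1e-9

# Expected immediate reward of taking action a in state s:
# E[R] = sum over s' of P_a(s, s') * R_a(s, s')
def expected_reward(s, a):
    return sum(P[a][s][s2] * R[a][s][s2] for s2 in S)
```

With these structures in place, expected_reward("s0", "go") evaluates the sum 0.2*0.0 + 0.8*5.0; the same nested-dictionary layout is what value- or policy-iteration style algorithms would consume.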
