Markov Decision Process - Definition

Definition

A Markov decision process is a 4-tuple (S, A, P_a, R_a), where

  • S is a finite set of states,
  • A is a finite set of actions (alternatively, A_s is the finite set of actions available from state s),
  • P_a(s, s') = Pr(s_{t+1} = s' | s_t = s, a_t = a) is the probability that taking action a in state s at time t will lead to state s' at time t+1,
  • R_a(s, s') is the immediate reward (or expected immediate reward) received after transitioning from state s to state s' due to action a.

(The theory of Markov decision processes does not actually require S or A to be finite, but the basic algorithms assume that they are finite.)
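
To make the 4-tuple concrete, here is a minimal Python sketch that represents an MDP as plain data and samples one transition. The two-state example (states s0/s1, actions stay/go), the reward values, and the helper name step are hypothetical, chosen only to illustrate the structure of (S, A, P_a, R_a); they are not part of the definition itself.

```python
import random

S = ["s0", "s1"]        # finite set of states
A = ["stay", "go"]      # finite set of actions

# P[a][s] maps each successor state s' to P_a(s, s');
# each inner dict of probabilities must sum to 1.
P = {
    "stay": {"s0": {"s0": 0.9, "s1": 0.1}, "s1": {"s0": 0.1, "s1": 0.9}},
    "go":   {"s0": {"s0": 0.2, "s1": 0.8}, "s1": {"s0": 0.8, "s1": 0.2}},
}

# R[a][(s, s')] is the immediate reward R_a(s, s') for the
# transition s -> s' under action a (values here are arbitrary).
R = {
    "stay": {("s0", "s0"): 0.0, ("s0", "s1"): 1.0,
             ("s1", "s0"): 1.0, ("s1", "s1"): 0.0},
    "go":   {("s0", "s0"): 0.0, ("s0", "s1"): 2.0,
             ("s1", "s0"): 2.0, ("s1", "s1"): 0.0},
}

def step(s, a):
    """Sample s' ~ P_a(s, .) and return (s', R_a(s, s'))."""
    next_states = list(P[a][s])
    probs = [P[a][s][sp] for sp in next_states]
    s_next = random.choices(next_states, weights=probs)[0]
    return s_next, R[a][(s, s_next)]

# Example: from s0, action "go" leads to s1 (reward 2.0) with probability 0.8.
s, r = step("s0", "go")
```

Note that the Markov property is built into this representation: the distribution over the next state depends only on the current state s and action a, not on any earlier history.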
