Stochastic Game - Theory


The ingredients of a stochastic game are: a finite set of players I; a state space M (either a finite set or a measurable space (M, 𝒜)); for each player i ∈ I, an action set S^i (either a finite set or a measurable space (S^i, 𝒮^i)); a transition probability P from M × S, where S = ∏_{i ∈ I} S^i is the set of action profiles, to M, where P(A | m, s) is the probability that the next state is in A given the current state m and the current action profile s; and a payoff function g from M × S to R^I, where the i-th coordinate of g, g^i, is the payoff to player i as a function of the state m and the action profile s.

The game starts at some initial state m_1. At stage t, players first observe m_t, then simultaneously choose actions s_t^i, then observe the action profile s_t = (s_t^i)_{i ∈ I}, and then nature selects m_{t+1} according to the probability P(· | m_t, s_t). A play of the stochastic game, m_1, s_1, ..., m_t, s_t, ..., defines a stream of payoffs g_1, g_2, ..., where g_t = g(m_t, s_t).
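A minimal sketch of these ingredients and of one simulated play, in Python. The two-state game, its transitions, and its payoffs are invented for illustration, and both players simply randomise uniformly over their actions:

```python
import random
from itertools import product

# Toy ingredients (all numbers illustrative): two players, states
# M = {"good", "bad"}, and two actions per player.
states = ("good", "bad")
acts = (("a", "b"), ("x", "y"))                       # S^1, S^2
profiles = list(product(*acts))                       # S = S^1 x S^2

# Transition probability P(. | m, s) and payoff function g(m, s).
P = {(m, s): ({"good": 0.8, "bad": 0.2} if m == "good"
              else {"good": 0.3, "bad": 0.7})
     for m in states for s in profiles}
g = {(m, s): (1.0, -1.0) if m == "good" else (0.0, 0.0)
     for m in states for s in profiles}

# One play m_1, s_1, m_2, s_2, ... under uniformly random choices.
rng = random.Random(0)
m = "good"                                            # initial state m_1
payoffs = []
for t in range(5):
    s = (rng.choice(acts[0]), rng.choice(acts[1]))    # simultaneous moves
    payoffs.append(g[(m, s)])                         # g_t = g(m_t, s_t)
    dist = P[(m, s)]                                  # nature draws m_{t+1}
    m = rng.choices(list(dist), weights=list(dist.values()))[0]
print(payoffs)
```

The point of the sketch is only the shape of the data: a transition law per (state, action-profile) pair and a payoff vector with one coordinate per player.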

The discounted game Γ_λ with discount factor λ (0 < λ ≤ 1) is the game where the payoff to player i is λ ∑_{t=1}^∞ (1 − λ)^{t−1} g_t^i. The n-stage game Γ_n is the game where the payoff to player i is ḡ_n^i := (1/n) ∑_{t=1}^n g_t^i.
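Both criteria are easy to check numerically. The sketch below (helper names are hypothetical) truncates the infinite discounted sum at a long finite horizon and uses a constant payoff stream g_t = 1, for which both criteria equal 1 because of the λ(1 − λ)^{t−1} normalisation:

```python
def discounted_payoff(stream, lam):
    # lambda * sum_{t>=1} (1 - lam)^(t-1) * g_t, truncated at len(stream)
    return lam * sum((1 - lam) ** t * g_t for t, g_t in enumerate(stream))

def n_stage_payoff(stream, n):
    # (1/n) * sum_{t=1}^{n} g_t
    return sum(stream[:n]) / n

stream = [1.0] * 1000                             # constant stage payoffs g_t = 1
print(round(discounted_payoff(stream, 0.5), 6))   # 1.0
print(n_stage_payoff(stream, 10))                 # 1.0
```

The normalisation makes the discounted payoff a weighted average of the stage payoffs, so the two criteria live on the same scale and their limits can be compared, as in the result below.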

The value v_n(m_1), respectively v_λ(m_1), of a two-person zero-sum stochastic game Γ_n, respectively Γ_λ, with finitely many states and actions exists, and Truman Bewley and Elon Kohlberg (1976) proved that v_n(m_1) converges to a limit as n goes to infinity and that v_λ(m_1) converges to the same limit as λ goes to 0.
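The discounted value v_λ can be computed by iterating the Shapley operator, v(m) ← val[ λ g(m, ·) + (1 − λ) ∑_{m′} P(m′ | m, ·) v(m′) ], whose fixed point is v_λ. The sketch below does this for an invented two-state game with 2 × 2 action sets in which each state is absorbing, so v_λ(m) should come out equal to the value of that state's stage matrix game:

```python
def matrix_value(A):
    """Value of a 2x2 zero-sum matrix game (row player maximises)."""
    lo = max(min(row) for row in A)                             # maxmin
    hi = min(max(A[i][j] for i in range(2)) for j in range(2))  # minmax
    if lo == hi:                                                # pure saddle point
        return lo
    a, b = A[0]
    c, d = A[1]
    return (a * d - b * c) / (a + d - b - c)                    # mixed value

# Invented absorbing two-state game: matching pennies in state 0
# (stage value 0) and a symmetric matrix in state 1 (stage value 1).
G = {0: [[1.0, -1.0], [-1.0, 1.0]],
     1: [[2.0, 0.0], [0.0, 2.0]]}
stay = {0: 0, 1: 1}        # deterministic transitions: each state is absorbing

def shapley_value(lam, iters=200):
    v = {0: 0.0, 1: 0.0}
    for _ in range(iters):   # iterate the Shapley operator to its fixed point
        v = {m: matrix_value([[lam * G[m][i][j] + (1 - lam) * v[stay[m]]
                               for j in range(2)] for i in range(2)])
             for m in G}
    return v

v = shapley_value(lam=0.1)
print(round(v[0], 6), round(v[1], 6))   # 0.0 1.0
```

With absorbing states, adding the constant continuation term (1 − λ)v(m) to every matrix entry shifts the matrix value by exactly that constant, so the fixed point is the stage value, which the iteration reproduces.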

The "undiscounted" game Γ_∞ is the game where the payoff to player i is the "limit" of the averages of the stage payoffs. Some precautions are needed in defining the value of a two-person zero-sum Γ_∞ and in defining the equilibrium payoffs of a non-zero-sum Γ_∞. The uniform value v_∞ of a two-person zero-sum stochastic game Γ_∞ exists if for every ε > 0 there is a positive integer N and a strategy pair σ_ε of player 1 and τ_ε of player 2 such that for every σ and τ and every n ≥ N, the expectation of ḡ_n^1 with respect to the probability on plays defined by σ_ε and τ is at least v_∞ − ε, and the expectation of ḡ_n^1 with respect to the probability on plays defined by σ and τ_ε is at most v_∞ + ε. Jean-François Mertens and Abraham Neyman (1981) proved that every two-person zero-sum stochastic game with finitely many states and actions has a uniform value.

If there is a finite number of players and the action sets and the set of states are finite, then a stochastic game with a finite number of stages always has a Nash equilibrium. The same is true for a game with infinitely many stages if the total payoff is the discounted sum. Nicolas Vieille has shown that all two-person stochastic games with finite state and action spaces have approximate Nash equilibria when the total payoff is the limit inferior of the averages of the stage payoffs. Whether such equilibria exist when there are more than two players is a challenging open question.

A Markov perfect equilibrium is a refinement of the concept of subgame perfect Nash equilibrium to stochastic games.
