Stochastic Control

Stochastic control, or stochastic optimal control, is a subfield of control theory that deals with uncertainty either in the observations or in the noise that drives the evolution of the system. The designer assumes, in a Bayesian probability-driven fashion, that random noise with a known probability distribution affects the evolution and observation of the state variables. Stochastic control aims to design the time path of the controlled variables that performs the desired control task at minimum cost, defined according to some criterion, despite the presence of this noise. The context may be either discrete time or continuous time.
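A standard discrete-time instance is the linear-quadratic problem with additive Gaussian noise, where the celebrated certainty-equivalence result says the optimal feedback gains are the same as in the noise-free problem and are found by a backward Riccati recursion. The sketch below illustrates this; the dynamics, horizon, and cost weights are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Minimal sketch of discrete-time stochastic LQ control (illustrative example).
# Dynamics: x_{k+1} = A x_k + B u_k + w_k, with w_k ~ N(0, sigma^2 I).
# Cost: E[ sum_k (x_k' Q x_k + u_k' R u_k) + x_T' Q x_T ].
# By certainty equivalence, the optimal gains ignore the noise statistics.

def lqr_gains(A, B, Q, R, T):
    """Backward Riccati recursion; returns feedback gains K_0..K_{T-1}."""
    P = Q.copy()
    gains = []
    for _ in range(T):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # gains[k] is applied at time step k

def simulate(A, B, gains, x0, noise_std, rng):
    """Roll out the closed loop u_k = -K_k x_k under additive noise.
    Accumulates the realized quadratic cost with Q = R = I."""
    x = x0.copy()
    cost = 0.0
    for K in gains:
        u = -K @ x
        cost += x @ x + u @ u
        x = A @ x + B @ u + noise_std * rng.standard_normal(x.shape)
    return cost + x @ x
```

As a usage example, stabilizing an unstable scalar system `x_{k+1} = 1.2 x_k + u_k + w_k` with these gains yields a far lower realized cost than leaving it uncontrolled, even though the controller was designed ignoring the noise.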

