In probability theory, the expected value (also called the expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that the random variable can take on. The weights used in computing this average are the probabilities in the case of a discrete random variable, or the probability densities in the case of a continuous random variable. From a rigorous theoretical standpoint, the expected value is the integral of the random variable with respect to its probability measure.
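Written out (a sketch in standard notation, with the symbols X, x_i, p_i, f, Ω, and P introduced here for illustration), these three descriptions correspond to

    E[X] = \sum_i x_i \, p_i                          (discrete: values weighted by their probabilities)
    E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx    (continuous: values weighted by the density f)
    E[X] = \int_{\Omega} X \, dP                      (measure-theoretic: integral with respect to the probability measure P)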
The expected value can be understood intuitively through the law of large numbers: when it exists, the expected value is almost surely the limit of the sample mean as the sample size grows to infinity. More informally, it can be interpreted as the long-run average of the results of many independent repetitions of an experiment (e.g. a die roll). The value may not be "expected" in the ordinary sense: the expected value itself may be unlikely or even impossible to observe (such as having 2.5 children), just as a sample mean may be.
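As an illustrative sketch of this long-run-average interpretation (a hypothetical simulation of a fair six-sided die, not drawn from the original text), the sample mean of many rolls settles near the expected value 3.5 even though no single roll can equal 3.5:

    import random

    def sample_mean_of_die_rolls(num_rolls: int, seed: int = 0) -> float:
        """Roll a fair six-sided die num_rolls times and return the sample mean."""
        rng = random.Random(seed)
        total = sum(rng.randint(1, 6) for _ in range(num_rolls))  # each face 1..6 with probability 1/6
        return total / num_rolls

    # The expected value of a single roll is (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5.
    # By the law of large numbers, the sample mean approaches 3.5 as num_rolls grows.
    for n in (10, 1_000, 100_000):
        print(n, sample_mean_of_die_rolls(n))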
The expected value does not exist for some distributions with heavy tails, such as the Cauchy distribution, because the defining integral fails to converge.
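For instance (a standard fact, written out here for illustration), the standard Cauchy distribution has density f(x) = 1 / (π(1 + x²)), and

    \int_{-\infty}^{\infty} |x| \, f(x) \, dx = \frac{2}{\pi} \int_{0}^{\infty} \frac{x}{1 + x^2} \, dx = \infty,

so the weighted average described above is not defined.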