In probability theory and statistics, the Bernoulli distribution, named after Swiss scientist Jacob Bernoulli, is a discrete probability distribution which takes value 1 with success probability p and value 0 with failure probability q = 1 − p. So if X is a random variable with this distribution, we have Pr(X = 1) = p and Pr(X = 0) = 1 − p.
A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability p and tails with probability 1-p. The experiment is called fair if p=0.5, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).
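The coin-toss experiment above can be sketched as a short simulation; `bernoulli_trial` is a hypothetical helper name, not from the original text. For a fair coin (p = 0.5) the fraction of heads should approach 0.5 over many tosses.

```python
import random

def bernoulli_trial(p):
    """Return 1 ("heads") with probability p, else 0 ("tails")."""
    return 1 if random.random() < p else 0

# A fair coin: over many tosses the fraction of heads approaches 0.5.
random.seed(0)
tosses = [bernoulli_trial(0.5) for _ in range(100_000)]
print(sum(tosses) / len(tosses))
```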
The probability mass function f of this distribution is f(k; p) = p if k = 1, and f(k; p) = 1 − p if k = 0. This can also be expressed compactly as f(k; p) = p^k (1 − p)^(1−k) for k ∈ {0, 1}.
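The compact form of the probability mass function translates directly into code; this is a minimal sketch, and `bernoulli_pmf` is an illustrative name rather than a standard library function.

```python
def bernoulli_pmf(k, p):
    """Probability mass function f(k; p) = p**k * (1 - p)**(1 - k), k in {0, 1}."""
    if k not in (0, 1):
        raise ValueError("k must be 0 or 1")
    return p ** k * (1 - p) ** (1 - k)

print(bernoulli_pmf(1, 0.3))  # success probability: 0.3
print(bernoulli_pmf(0, 0.3))  # failure probability: 0.7
```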
The expected value of a Bernoulli random variable X is E[X] = p, and its variance is Var[X] = p(1 − p).
The above can be derived by treating the Bernoulli distribution as the special case of the binomial distribution with n = 1: setting n = 1 in the binomial mean np and variance np(1 − p) gives p and p(1 − p).
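The n = 1 reduction can be checked with a small sketch; `binomial_mean_var` is a hypothetical helper that just evaluates the standard binomial formulas.

```python
def binomial_mean_var(n, p):
    """Mean and variance of a Binomial(n, p) variable: (n*p, n*p*(1-p))."""
    return n * p, n * p * (1 - p)

# With n = 1 the binomial reduces to the Bernoulli case:
# mean p and variance p*(1 - p).
mean, var = binomial_mean_var(1, 0.3)
print(mean, var)  # approximately 0.3 and 0.21
```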
The kurtosis goes to infinity for high and low values of p, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
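Both behaviors follow from the closed-form excess kurtosis of a Bernoulli(p) variable, (1 − 6p(1 − p)) / (p(1 − p)); this sketch (with a hypothetical helper name) evaluates it at p = 1/2 and near the endpoints.

```python
def bernoulli_excess_kurtosis(p):
    """Excess kurtosis of Bernoulli(p): (1 - 6*p*(1-p)) / (p*(1-p))."""
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)

print(bernoulli_excess_kurtosis(0.5))   # -2.0, the minimum over all distributions
print(bernoulli_excess_kurtosis(0.01))  # large: kurtosis blows up as p nears 0 or 1
```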
The Bernoulli distribution is a member of the exponential family.
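The membership can be made explicit by rewriting the probability mass function in exponential-family form (a standard derivation, sketched here in LaTeX):

```latex
f(k;p) = p^k (1-p)^{1-k}
       = \exp\!\left( k \ln\frac{p}{1-p} + \ln(1-p) \right),
\qquad k \in \{0,1\},
```

with natural parameter θ = ln(p / (1 − p)) (the log-odds), sufficient statistic T(k) = k, and log-partition function A(θ) = ln(1 + e^θ) = −ln(1 − p).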
The maximum likelihood estimator of p based on a random sample is the sample mean.
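As a quick illustration of that estimator, the sketch below draws a sample from a Bernoulli distribution with a known p and recovers it via the sample mean; the variable names are illustrative.

```python
import random

random.seed(1)
p_true = 0.3
# Draw a random sample of Bernoulli(p_true) observations.
sample = [1 if random.random() < p_true else 0 for _ in range(50_000)]

# The maximum likelihood estimate of p is simply the sample mean.
p_hat = sum(sample) / len(sample)
print(p_hat)  # close to the true value 0.3
```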