In probability theory and statistics, the Bernoulli distribution, named after the Swiss mathematician Jacob Bernoulli, is a discrete probability distribution that takes the value 1 with success probability p and the value 0 with failure probability q = 1 − p. So if X is a random variable with this distribution, we have:

P(X = 1) = p,  P(X = 0) = 1 − p = q.
A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability p and tails with probability 1-p. The experiment is called fair if p=0.5, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).
The probability mass function f of this distribution is

f(k; p) = p if k = 1, and f(k; p) = 1 − p if k = 0.

This can also be expressed as

f(k; p) = p^k (1 − p)^(1−k) for k in {0, 1}.
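The compact form of the probability mass function can be sketched directly in plain Python (the function name here is mine, not from any particular library):

```python
def bernoulli_pmf(k, p):
    """Probability mass function of a Bernoulli(p) random variable.

    Returns p**k * (1 - p)**(1 - k), which equals p for k = 1
    and 1 - p for k = 0; any other k has probability 0.
    """
    if k not in (0, 1):
        return 0.0
    return p ** k * (1 - p) ** (1 - k)
```

Note how the exponent trick selects p or 1 − p without branching, which is why this form is convenient in likelihood calculations.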
The expected value of a Bernoulli random variable X is E(X) = p, and its variance is Var(X) = p(1 − p).
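These moments can be checked empirically with a small Monte Carlo simulation; the following sketch uses only the standard library, and the function name and sample size are my own choices:

```python
import random

def bernoulli_moments_mc(p, n=100_000, seed=0):
    """Estimate the mean and variance of Bernoulli(p) by simulation.

    Draws n Bernoulli(p) samples and returns (sample mean, sample
    variance); these should be close to p and p * (1 - p).
    """
    rng = random.Random(seed)
    draws = [1 if rng.random() < p else 0 for _ in range(n)]
    mean = sum(draws) / n
    var = sum((x - mean) ** 2 for x in draws) / n
    return mean, var
```

For p = 0.3, the estimates should land near E(X) = 0.3 and Var(X) = 0.3 × 0.7 = 0.21.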
The above can be derived by viewing the Bernoulli distribution as a special case of the binomial distribution with n = 1.
The excess kurtosis goes to infinity for high and low values of p, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
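The closed-form expression for the excess kurtosis of a Bernoulli(p) variable is (1 − 6p(1 − p)) / (p(1 − p)), which makes both behaviors easy to see: the denominator vanishes as p approaches 0 or 1, and the expression evaluates to exactly −2 at p = 1/2. A minimal sketch (function name is mine):

```python
def bernoulli_excess_kurtosis(p):
    """Excess kurtosis of Bernoulli(p): (1 - 6p(1-p)) / (p(1-p)).

    Blows up as p -> 0 or p -> 1, and attains its minimum of -2
    at p = 0.5.
    """
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)
```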
The Bernoulli distribution is a member of the exponential family.
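In the standard exponential-family parameterization, the natural parameter of the Bernoulli distribution is the log-odds θ = log(p / (1 − p)), and the mean is recovered through the logistic sigmoid. The following sketch illustrates that round trip (function names are my own):

```python
import math

def natural_param(p):
    """Natural parameter of Bernoulli(p) in exponential-family form:
    theta = log(p / (1 - p)), the log-odds (logit) of p."""
    return math.log(p / (1 - p))

def mean_from_natural(theta):
    """Inverse mapping: p = 1 / (1 + exp(-theta)), the logistic sigmoid."""
    return 1 / (1 + math.exp(-theta))
```

This logit/sigmoid pair is the same link function that underlies logistic regression.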
The maximum likelihood estimator of p based on a random sample is the sample mean.
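Since the maximum likelihood estimator is just the sample mean, it is a one-liner; the helper name below is mine:

```python
def bernoulli_mle(sample):
    """Maximum likelihood estimate of p from 0/1 observations:
    the fraction of successes, i.e. the sample mean."""
    return sum(sample) / len(sample)
```

For example, observing the outcomes 1, 0, 1, 1 gives the estimate p = 3/4.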