Bernoulli Distribution

In probability theory and statistics, the Bernoulli distribution, named after Swiss scientist Jacob Bernoulli, is a discrete probability distribution which takes the value 1 with success probability p and the value 0 with failure probability q = 1 - p. So if X is a random variable with this distribution, we have Pr(X = 1) = p and Pr(X = 0) = 1 - p = q.

A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability p and tails with probability 1-p. The experiment is called fair if p=0.5, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).
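The coin-toss experiment can be sketched in a few lines of Python using only the standard library; the helper name `bernoulli_trial` is illustrative, not part of any particular API:

```python
import random

def bernoulli_trial(p, rng=random):
    """Simulate one Bernoulli trial: return 1 with probability p, else 0."""
    return 1 if rng.random() < p else 0

# Toss a fair coin (p = 0.5) many times; the fraction of heads
# should settle near 0.5 by the law of large numbers.
random.seed(0)
tosses = [bernoulli_trial(0.5) for _ in range(100_000)]
print(sum(tosses) / len(tosses))
```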

The probability mass function f of this distribution is

 f(k;p) = \begin{cases} p & \text{if } k = 1, \\
1-p & \text{if } k = 0. \end{cases}

This can also be expressed as f(k;p) = p^k (1-p)^{1-k} for k \in \{0, 1\}.

The expected value of a Bernoulli random variable X is E[X] = p, and its variance is Var[X] = p(1 - p) = pq.
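These moments can be checked directly from the probability mass function by summing over the two outcomes; a short sketch in plain Python (the function name `bernoulli_pmf` is illustrative):

```python
def bernoulli_pmf(k, p):
    """Closed-form pmf: f(k; p) = p**k * (1 - p)**(1 - k) for k in {0, 1}."""
    return p**k * (1 - p)**(1 - k)

p = 0.3
# E[X] = sum of k * f(k) over k in {0, 1}, which reduces to p.
mean = sum(k * bernoulli_pmf(k, p) for k in (0, 1))
# Var[X] = sum of (k - E[X])**2 * f(k), which reduces to p * (1 - p).
var = sum((k - mean) ** 2 * bernoulli_pmf(k, p) for k in (0, 1))
print(mean, var)
```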

The above can be derived by treating the Bernoulli distribution as the special case of the Binomial distribution with n = 1.

The excess kurtosis goes to infinity for high and low values of p, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely -2.
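A small numerical sketch of this behavior, using the standard closed form for the excess kurtosis of a Bernoulli(p) variable, (1 - 6p(1-p)) / (p(1-p)):

```python
def excess_kurtosis(p):
    """Excess kurtosis of Bernoulli(p): (1 - 6*p*q) / (p*q) with q = 1 - p."""
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)

# Attains its minimum of -2 at p = 0.5 and blows up as p nears 0 or 1.
print(excess_kurtosis(0.5))   # -2.0
print(excess_kurtosis(0.01))  # large positive value
```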

The Bernoulli distribution is a member of the exponential family.

The maximum likelihood estimator of p based on a random sample is the sample mean.
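In code, this estimator is just an average of the observed 0/1 outcomes; a minimal sketch (the function name `mle_p` is illustrative):

```python
def mle_p(sample):
    """Maximum likelihood estimate of p from Bernoulli data: the sample mean."""
    return sum(sample) / len(sample)

# Three successes in five trials gives the estimate 3/5.
print(mle_p([1, 0, 1, 1, 0]))  # 0.6
```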

