Geometric Distribution - Parameter Estimation

For both variants of the geometric distribution, the parameter p can be estimated by equating the expected value with the sample mean. This is the method of moments, which in this case happens to yield maximum likelihood estimates of p.

Specifically, for the first variant let k1, ..., kn be a sample where ki ≥ 1 for i = 1, ..., n. Then p can be estimated as

p̂ = n / (k1 + ... + kn) = 1 / k̄,

where k̄ = (k1 + ... + kn) / n is the sample mean.
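
As a concrete illustration, here is a minimal plain-Python sketch of this estimator; the sample values are made up purely for the example.

    # MLE / method-of-moments estimate of p for the "number of trials"
    # variant (support {1, 2, 3, ...}); the data are hypothetical.
    sample = [3, 1, 4, 2, 1, 6, 2, 1]   # each ki >= 1

    n = len(sample)
    p_hat = n / sum(sample)             # p_hat = n / (k1 + ... + kn)

    print(f"p_hat = {p_hat:.4f}")       # reciprocal of the sample mean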

In Bayesian inference, the Beta distribution is the conjugate prior distribution for the parameter p. If this parameter is given a Beta(α, β) prior, then the posterior distribution is

p | k1, ..., kn ~ Beta(α + n, β + k1 + ... + kn − n).

The posterior mean E[p] approaches the maximum likelihood estimate as α and β approach zero.
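
A minimal sketch of this conjugate update in plain Python follows; the prior hyperparameters and data are illustrative assumptions, not values from the text.

    # Conjugate Beta update for the "number of trials" variant:
    # prior Beta(alpha, beta) -> posterior Beta(alpha + n, beta + sum(k) - n).
    alpha, beta = 1.0, 1.0              # e.g. a uniform prior
    sample = [3, 1, 4, 2, 1, 6, 2, 1]   # each ki >= 1

    n = len(sample)
    post_a = alpha + n
    post_b = beta + sum(sample) - n

    post_mean = post_a / (post_a + post_b)   # E[p] = (alpha + n) / (alpha + beta + sum(k))
    mle = n / sum(sample)
    print(f"posterior mean = {post_mean:.4f}, MLE = {mle:.4f}")

Shrinking alpha and beta toward zero drives the posterior mean to the maximum likelihood estimate, in line with the statement above.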

In the alternative case, let k1, ..., kn be a sample where ki ≥ 0 for i = 1, ..., n. Then p can be estimated as

p̂ = n / (n + k1 + ... + kn) = 1 / (1 + k̄).

The posterior distribution of p given a Beta(α, β) prior is

p | k1, ..., kn ~ Beta(α + n, β + k1 + ... + kn).

Again the posterior mean E[p] approaches the maximum likelihood estimate as α and β approach zero.
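
The same sketch carries over to this variant with the shifted formulas (again with made-up data):

    # "Number of failures" variant (support {0, 1, 2, ...}); illustrative data.
    alpha, beta = 1.0, 1.0
    sample = [2, 0, 3, 1, 0, 5, 1, 0]   # each ki >= 0

    n = len(sample)
    p_hat = n / (n + sum(sample))       # MLE: p_hat = n / (n + k1 + ... + kn)

    post_a = alpha + n                  # posterior Beta(alpha + n, beta + sum(k))
    post_b = beta + sum(sample)
    post_mean = post_a / (post_a + post_b)

    print(f"MLE = {p_hat:.4f}, posterior mean = {post_mean:.4f}")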
