Likelihood Function - Example 2


Consider a jar containing N lottery tickets numbered from 1 through N. If you pick a ticket at random, you get a positive integer n with probability 1/N if n ≤ N, and with probability zero if n > N. This can be written

P(n|N) = \frac{[n \le N]}{N},

where the Iverson bracket [n ≤ N] is 1 when n ≤ N and 0 otherwise. When considered as a function of n for fixed N, this is the probability distribution; when considered as a function of N for fixed n, it is a likelihood function. The maximum likelihood estimate for N is N0 = n (by contrast, the unbiased estimate is 2n − 1).
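
For reference, a quick check of the unbiasedness claim: under the draw described above, n is uniform on 1, …, N, so

\operatorname{E}[n] = \frac{1}{N}\sum_{n=1}^{N} n = \frac{N+1}{2},
\qquad
\operatorname{E}[2n-1] = 2 \cdot \frac{N+1}{2} - 1 = N.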

This likelihood function is not a probability distribution, because the total

\sum_{N=1}^\infty P(n|N) = \sum_{N=n}^\infty \frac{1}{N}

is a divergent series.
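
A standard integral comparison makes the divergence explicit:

\sum_{N=n}^\infty \frac{1}{N} \;\ge\; \int_n^\infty \frac{dx}{x} = \infty.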

Suppose, however, that you pick two tickets rather than one.

The probability of the outcome {n1, n2}, where n1 < n2, is

P(\{n_1,n_2\}|N) = \frac{[n_2 \le N]}{\binom N 2}.
When considered as a function of N for fixed n2, this is a likelihood function. The maximum likelihood estimate for N is N0 = n2.
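
The maximization is immediate: the likelihood is zero for N < n2 and, because \binom N 2 increases with N, strictly decreasing for N ≥ n2, so the smallest admissible value of N maximizes it:

\frac{[n_2 \le N]}{\binom N 2} \quad\text{is maximized at}\quad N_0 = n_2.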

This time the total

\sum_{N=1}^\infty P(\{n_1,n_2\}|N)
= \sum_{N=n_2}^\infty \frac{1}{\binom N 2}
= \frac{2}{n_2-1}

is a convergent series, and so this likelihood function can be normalized into a probability distribution.
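
A telescoping (partial-fractions) check of that sum, and the distribution it normalizes to, in the notation above:

\frac{1}{\binom N 2} = \frac{2}{N(N-1)} = 2\left(\frac{1}{N-1} - \frac{1}{N}\right),
\qquad
\sum_{N=n_2}^\infty \frac{1}{\binom N 2} = \frac{2}{n_2-1},

so the normalized likelihood, viewed as a distribution over N, is

\frac{n_2-1}{N(N-1)} \qquad \text{for } N \ge n_2.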

If you pick 3 or more tickets, the likelihood function has a well-defined mean value, which is larger than the maximum likelihood estimate. If you pick 4 or more tickets, it has a well-defined standard deviation as well.
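
A rough numerical illustration (a sketch, not part of the original argument): assume the k-ticket likelihood is proportional to 1/\binom N k for N at least the largest ticket drawn m (the natural generalization of the two-ticket formula above). Truncating the infinite sums at a large cutoff then gives an approximate mean of the normalized likelihood; the example values k = 3, m = 5 and the cutoff are arbitrary choices for illustration.

from math import comb

def likelihood_mean(m, k, cutoff=100_000):
    # Approximate mean of the normalized k-ticket likelihood, assumed
    # proportional to 1 / C(N, k) for N >= m (m = largest ticket drawn),
    # with the infinite sums truncated at `cutoff`.
    total = 0.0      # truncated normalizing constant
    weighted = 0.0   # truncated sum of N * weight
    for N in range(m, cutoff):
        w = 1.0 / comb(N, k)
        total += w
        weighted += N * w
    return weighted / total

print(likelihood_mean(5, 3))  # approximately 8.0, larger than the MLE of 5

With k = 2 the analogous truncated mean keeps growing as the cutoff increases, which is another way of seeing that the two-ticket likelihood has no finite mean.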
