Likelihood-ratio Test - Simple-versus-simple Hypotheses

Simple-versus-simple Hypotheses

A statistical model is often a parametrized family of probability density functions or probability mass functions f(x \mid \theta). A simple-versus-simple hypothesis test has completely specified models under both the null and alternative hypotheses, which for convenience are written in terms of fixed values of a notional parameter \theta:


\begin{align}
H_0 &:& \theta=\theta_0 ,\\
H_1 &:& \theta=\theta_1 .
\end{align}

Note that under either hypothesis, the distribution of the data is fully specified; there are no unknown parameters to estimate. The likelihood ratio test statistic can be written as:


\Lambda(x) = \frac{ L(\theta_0|x) }{ L(\theta_1|x) } = \frac{ f(x|\theta_0) }{ f(x|\theta_1) }

where L(\theta \mid x) is the likelihood function. Note that some references use the reciprocal as the definition. In the form stated here, the likelihood ratio is small if the alternative model fits the data better than the null model, and the likelihood-ratio test provides the decision rule:

If \Lambda > c, do not reject H_0;
If \Lambda < c, reject H_0;
If \Lambda = c, reject H_0 with probability q.
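As a concrete sketch of the statistic itself, the ratio can be evaluated directly whenever both models are fully specified. The snippet below is a minimal illustration, not from the text: it assumes a binomial coin-flip model with hypothesized success probabilities p0 = 0.5 and p1 = 0.7 and a sample of n = 20 trials, all of which are invented for the example.

```python
from math import comb

def likelihood(p, k, n):
    """Binomial likelihood of observing k successes in n trials
    when the success probability is p (fully specified model)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def likelihood_ratio(k, n, p0, p1):
    """Lambda(x) = L(p0 | x) / L(p1 | x).
    Small values mean the alternative model explains the data better."""
    return likelihood(p0, k, n) / likelihood(p1, k, n)

# 14 successes in 20 trials lean toward p1 = 0.7, so Lambda falls below 1;
# 6 successes lean toward p0 = 0.5, so Lambda exceeds 1.
print(likelihood_ratio(14, 20, 0.5, 0.7))
print(likelihood_ratio(6, 20, 0.5, 0.7))
```

Because both hypotheses fix every parameter, no estimation step is needed; the statistic is a plain ratio of two known densities evaluated at the observed data.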

The values c and q are usually chosen to obtain a specified significance level \alpha, through the relation q \, P(\Lambda = c \mid H_0) + P(\Lambda < c \mid H_0) = \alpha. The Neyman–Pearson lemma states that this likelihood-ratio test is the most powerful among all level-\alpha tests for this problem.
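For discrete data the relation above generally cannot be satisfied exactly without randomization, since the null distribution of the statistic has atoms. The sketch below, under the same assumed binomial model as before (p0 = 0.5, p1 = 0.7, n = 20, none of which come from the text), finds the cutoff and the randomization probability q by enumeration; it exploits the fact that when p1 > p0 the ratio is decreasing in the success count, so rejecting for small values of the ratio means rejecting for large counts.

```python
from math import comb

def randomized_rule(n, p0, alpha):
    """Find the critical count k* and randomization probability q so that
    P(K > k* | H0) + q * P(K = k* | H0) = alpha exactly,
    where K ~ Binomial(n, p0).  Reject for K > k*; at K = k*, reject
    with probability q (Neyman-Pearson test with randomization)."""
    pmf0 = [comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    tail = 0.0  # accumulated rejection probability under H0
    for k in range(n, -1, -1):  # scan from the most extreme count down
        if tail + pmf0[k] > alpha:
            q = (alpha - tail) / pmf0[k]
            size = tail + q * pmf0[k]  # exact size of the test
            return k, q, size
        tail += pmf0[k]

k_star, q, size = randomized_rule(20, 0.5, 0.05)
print(k_star, q, size)
```

With these assumed numbers the rule rejects outright for more than 14 successes and rejects with some probability q at exactly 14, making the size exactly 0.05, which is what the Neyman–Pearson lemma requires of the most powerful level-\alpha test.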
