Likelihood Function of a Parameterized Model
Among many applications, we consider here one of broad theoretical and practical importance. Given a parameterized family of probability density functions (or probability mass functions in the case of discrete distributions)

x ↦ f(x | θ),

where θ is the parameter, the likelihood function is

θ ↦ f(x | θ),

written

L(θ | x) = f(x | θ),
where x is the observed outcome of an experiment. In other words, when f(x | θ) is viewed as a function of x with θ fixed, it is a probability density function, and when viewed as a function of θ with x fixed, it is a likelihood function.
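As a concrete illustration, the following minimal sketch uses a Bernoulli model (an assumption chosen here for illustration, not taken from the text above): the same expression f(x | θ) behaves as a probability mass function when x varies with θ fixed, and as a likelihood function L(θ | x) when θ varies with x fixed.

    def f(x, theta):
        """Bernoulli pmf f(x | theta): probability of outcome x in {0, 1} given parameter theta."""
        return theta if x == 1 else 1.0 - theta

    # Viewed as a function of x with theta fixed: a probability mass function (sums to 1 over x).
    theta_fixed = 0.3
    print(sum(f(x, theta_fixed) for x in (0, 1)))  # 1.0

    # Viewed as a function of theta with x fixed: a likelihood function L(theta | x) = f(x | theta).
    x_observed = 1
    likelihood = lambda theta: f(x_observed, theta)
    print(likelihood(0.2), likelihood(0.7))  # 0.2 0.7

Note that the likelihood values need not sum or integrate to 1 over θ; only the density or mass function is normalized, and only over x.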
Note: This is not the same as the probability that those parameters are the right ones, given the observed sample. Attempting to interpret the likelihood of a hypothesis given observed evidence as the probability of the hypothesis is a common error, with potentially disastrous real-world consequences in medicine, engineering or jurisprudence. See prosecutor's fallacy for an example of this.
From a geometric standpoint, if we consider f(x, θ) as a function of two variables, then the family of probability distributions can be viewed as a family of curves parallel to the x-axis, while the family of likelihood functions consists of the orthogonal curves parallel to the θ-axis.
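To make this geometric picture concrete, the sketch below (assuming, purely for illustration, a normal model with unknown mean θ and unit variance, and using numpy and scipy) tabulates f(x, θ) on a grid: each row, with θ fixed, is a slice through a probability density curve, and each column, with x fixed, is a slice through the orthogonal likelihood curve.

    import numpy as np
    from scipy.stats import norm

    # Illustrative normal model: f(x | theta) = N(x; theta, 1).
    xs = np.linspace(-3, 3, 7)        # outcomes x
    thetas = np.linspace(-2, 2, 5)    # parameter values theta

    # Tabulate f(x, theta) as a function of two variables.
    grid = norm.pdf(xs[None, :], loc=thetas[:, None], scale=1.0)  # shape (len(thetas), len(xs))

    # Each row (theta fixed, x varying) samples a probability density curve.
    # Each column (x fixed, theta varying) samples the orthogonal likelihood curve.
    density_slice = grid[2, :]      # f(. | theta = 0) over the x grid
    likelihood_slice = grid[:, 3]   # L(theta | x = 0) over the theta grid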