Model Setup
Consider a standard linear regression problem, in which for $i = 1, \ldots, n$ we specify the conditional distribution of $y_i$ given a $k \times 1$ predictor vector $\mathbf{x}_i$:

$$y_i = \mathbf{x}_i^{\mathsf{T}} \boldsymbol\beta + \varepsilon_i,$$

where $\boldsymbol\beta$ is a $k \times 1$ vector, and the $\varepsilon_i$ are independent and identically normally distributed random variables:

$$\varepsilon_i \sim N(0, \sigma^2).$$
This corresponds to the following likelihood function:

$$\rho(\mathbf{y} \mid \mathbf{X}, \boldsymbol\beta, \sigma^2) \propto (\sigma^2)^{-n/2} \exp\!\left( -\frac{1}{2\sigma^2} (\mathbf{y} - \mathbf{X}\boldsymbol\beta)^{\mathsf{T}} (\mathbf{y} - \mathbf{X}\boldsymbol\beta) \right).$$
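As an illustration, the sketch below simulates data from this model and evaluates the Gaussian log-likelihood. The sample size, dimension, coefficients, and noise variance are arbitrary illustrative choices, not values taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameter values (assumptions, not from the text).
n, k = 100, 3
beta_true = np.array([1.5, -2.0, 0.5])   # coefficient vector of length k
sigma2 = 0.25                            # noise variance

# Rows of X are the predictor vectors x_i^T; y_i = x_i^T beta + eps_i.
X = rng.normal(size=(n, k))
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

def log_likelihood(beta, sigma2, X, y):
    """Gaussian log-likelihood log p(y | X, beta, sigma^2)."""
    resid = y - X @ beta
    return -0.5 * len(y) * np.log(2 * np.pi * sigma2) - resid @ resid / (2 * sigma2)

print(log_likelihood(beta_true, sigma2, X, y))
```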
The ordinary least squares solution is to estimate the coefficient vector $\boldsymbol\beta$ using the Moore–Penrose pseudoinverse:

$$\hat{\boldsymbol\beta} = (\mathbf{X}^{\mathsf{T}} \mathbf{X})^{-1} \mathbf{X}^{\mathsf{T}} \mathbf{y},$$

where $\mathbf{X}$ is the $n \times k$ design matrix, each row of which is a predictor vector $\mathbf{x}_i^{\mathsf{T}}$; and $\mathbf{y}$ is the column $n$-vector $[y_1 \; \cdots \; y_n]^{\mathsf{T}}$.
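A minimal sketch of this estimate using NumPy's pseudoinverse, again on arbitrary synthetic data; the least-squares solver is shown alongside it as the numerically preferable route to the same answer.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3                               # illustrative sizes
X = rng.normal(size=(n, k))                 # design matrix; rows are x_i^T
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=n)

# OLS estimate via the Moore-Penrose pseudoinverse: beta_hat = pinv(X) @ y,
# which equals (X^T X)^{-1} X^T y when X has full column rank.
beta_hat = np.linalg.pinv(X) @ y

# Equivalent least-squares solve, usually preferred numerically.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)
print(beta_lstsq)
```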
This is a frequentist approach, and it assumes that there are enough measurements to say something meaningful about $\boldsymbol\beta$. In the Bayesian approach, the data are supplemented with additional information in the form of a prior probability distribution. The prior belief about the parameters is combined with the data's likelihood function according to Bayes' theorem to yield the posterior belief about the parameters $\boldsymbol\beta$ and $\sigma$. The prior can take different functional forms depending on the domain and the information that is available a priori.
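As one concrete, simplified instance of such a prior, the sketch below assumes a Gaussian prior on $\boldsymbol\beta$ with the noise variance $\sigma^2$ treated as known; in that case the posterior is also Gaussian and has a closed form. The hyperparameters are illustrative assumptions; the fully conjugate treatment instead places a normal-inverse-gamma prior jointly on $(\boldsymbol\beta, \sigma^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, -2.0, 0.5])
sigma2 = 0.25                                  # treated as known here for simplicity
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Gaussian prior beta ~ N(mu0, Lambda0^{-1}); illustrative hyperparameters.
mu0 = np.zeros(k)
Lambda0 = np.eye(k)                            # prior precision matrix

# Conjugate update with known sigma^2: the posterior is N(mu_n, Lambda_n^{-1}) with
#   Lambda_n = Lambda0 + X^T X / sigma^2
#   mu_n     = Lambda_n^{-1} (Lambda0 mu0 + X^T y / sigma^2)
Lambda_n = Lambda0 + X.T @ X / sigma2
mu_n = np.linalg.solve(Lambda_n, Lambda0 @ mu0 + X.T @ y / sigma2)

print("posterior mean:", mu_n)
print("posterior covariance:", np.linalg.inv(Lambda_n))
```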