Linear Model
Suppose the data consist of n observations { yi, xi }, i = 1, …, n. Each observation includes a scalar response yi and a vector of predictors (or regressors) xi. In a linear regression model the response variable is a linear function of the regressors:

yi = x′iβ + εi,   i = 1, …, n,
where β is a p×1 vector of unknown parameters; the εi are unobserved scalar random variables (errors) which account for the discrepancy between the actually observed responses yi and the "predicted outcomes" x′iβ; and ′ denotes matrix transpose, so that x′iβ is the dot product between the vectors xi and β. This model can also be written in matrix notation as

y = Xβ + ε,
where y and ε are n×1 vectors, and X is an n×p matrix of regressors, which is also sometimes called the design matrix.
Usually the constant term is included in the set of regressors X, for example by taking xi1 = 1 for all i = 1, …, n. The coefficient β1 corresponding to this regressor is called the intercept.
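The matrix form y = Xβ + ε, with a column of ones in X for the intercept, can be sketched as follows. This is a minimal illustration using NumPy, with made-up coefficients and noise levels not taken from the source:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # number of observations

# Hypothetical true parameters; the first entry is the intercept beta_1.
beta_true = np.array([2.0, -1.0, 0.5])

# Design matrix X: a column of ones (x_i1 = 1 for all i) followed by
# two random regressors, so X is n x p with p = 3.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

# Unobserved errors eps_i and the observed responses y = X beta + eps.
eps = rng.normal(scale=0.1, size=n)
y = X @ beta_true + eps

# Ordinary least squares: beta_hat minimizes ||y - X beta||^2.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # should be close to beta_true
```

With n = 100 observations and small noise, the estimate recovers the true coefficients closely; `np.linalg.lstsq` solves the least-squares problem directly without forming (X′X)⁻¹ explicitly.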
There may be some relationship between the regressors. For instance, the third regressor may be the square of the second regressor. In this case (assuming that the first regressor is constant) we have a quadratic model in the second regressor. But this is still considered a linear model because it is linear in the βs.
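The quadratic case above can be sketched in the same way: the model is nonlinear in the predictor x but linear in the βs, so ordinary least squares still applies with regressors [1, x, x²]. The coefficients and noise level here are illustrative assumptions, not from the source:

```python
import numpy as np

rng = np.random.default_rng(1)

# Single underlying predictor x, response generated from a quadratic
# relationship y = 1 + 2x - 3x^2 + noise (hypothetical coefficients).
x = rng.uniform(-2, 2, size=200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.05, size=200)

# Design matrix: constant, x, and x squared. The third regressor is
# the square of the second, yet the model stays linear in the betas.
X = np.column_stack([np.ones_like(x), x, x**2])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # approximately [1, 2, -3]
```

The same machinery fits any model built from fixed transformations of the predictors, since linearity is required only in the unknown parameters.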