Ordinary Least Squares - Linear Model

Linear Model

Suppose the data consists of n observations { yi, xi }, i = 1, …, n. Each observation includes a scalar response yi and a vector of predictors (or regressors) xi. In a linear regression model the response variable is a linear function of the regressors:

 y_i = x'_i\beta + \varepsilon_i,

where β is a p×1 vector of unknown parameters; the εi are unobserved scalar random variables (errors) which account for the discrepancy between the actually observed responses yi and the "predicted outcomes" x′iβ; and ′ denotes the matrix transpose, so that x′iβ is the dot product between the vectors xi and β. This model can also be written in matrix notation as

 y = X\beta + \varepsilon,

where y and ε are n×1 vectors, and X is the n×p matrix of regressors, which is also sometimes called the design matrix.
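To make the matrix form concrete, here is a minimal sketch in Python with NumPy; the data, the true β, and the noise level are all invented for illustration. It simulates y = Xβ + ε and recovers an estimate of β by least squares:

 import numpy as np

 rng = np.random.default_rng(0)
 n, p = 100, 3                        # n observations, p regressors
 X = rng.normal(size=(n, p))          # n-by-p design matrix
 beta = np.array([2.0, -1.0, 0.5])    # true p-by-1 parameter vector (invented)
 eps = rng.normal(scale=0.1, size=n)  # unobserved errors
 y = X @ beta + eps                   # the model y = X beta + epsilon

 # OLS estimate: the b minimizing ||y - Xb||^2
 beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
 print(beta_hat)                      # close to [2.0, -1.0, 0.5]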

Usually a constant term is included in the set of regressors X, say, by taking xi1 = 1 for all i = 1, …, n. The coefficient β1 corresponding to this regressor is called the intercept.
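In code, including the constant term just means making the first column of X all ones; a small sketch (the data is invented, the NumPy calls are standard):

 import numpy as np

 x = np.array([1.0, 2.0, 3.0, 4.0])         # a single raw regressor
 X = np.column_stack([np.ones_like(x), x])  # sets x_i1 = 1 for all i
 # X[:, 0] is the constant regressor; its coefficient beta_1 is the intercept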

There may be some relationship between the regressors: for instance, the third regressor may be the square of the second. In that case (assuming the first regressor is constant) we have a quadratic model in the second regressor, but it is still a linear model because it is linear in the βs.
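For instance, a quadratic model can be fit with exactly the same least-squares machinery, because the design matrix with columns 1, x and x² still enters the model linearly through β; a sketch with invented data:

 import numpy as np

 rng = np.random.default_rng(1)
 x = np.linspace(-2.0, 2.0, 50)
 y = 1.0 + 0.5 * x - 2.0 * x**2 + rng.normal(scale=0.1, size=x.size)

 # Columns 1, x, x**2: nonlinear in x, but linear in the parameters beta
 X = np.column_stack([np.ones_like(x), x, x**2])
 beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
 print(beta_hat)                      # approximately [1.0, 0.5, -2.0]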
