**Linear Model**

Suppose the data consists of *n* observations {*y*_{i}, *x*_{i}}, *i* = 1, …, *n*. Each observation includes a scalar response *y*_{i} and a vector of predictors (or regressors) *x*_{i}. In a linear regression model the response variable is a linear function of the regressors:

*y*_{i} = *x*_{i}′*β* + *ε*_{i},

where *β* is a *p*×1 vector of unknown parameters; the *ε*_{i}'s are unobserved scalar random variables (errors) which account for the discrepancy between the actually observed responses *y*_{i} and the "predicted outcomes" *x*_{i}′*β*; and ′ denotes matrix transpose, so that *x*_{i}′*β* is the dot product between the vectors *x*_{i} and *β*. This model can also be written in matrix notation as

*y* = *Xβ* + *ε*,

where *y* and *ε* are *n*×1 vectors, and *X* is an *n*×*p* matrix of regressors, which is also sometimes called the design matrix.
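As an illustrative sketch of the matrix form (the names and values below are hypothetical, not from the original text), one can simulate data from a known *β* with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 100, 3                         # n observations, p regressors
X = rng.normal(size=(n, p))           # n×p design matrix
beta = np.array([2.0, -1.0, 0.5])     # true p×1 parameter vector
eps = rng.normal(scale=0.1, size=n)   # unobserved scalar errors ε_i

y = X @ beta + eps                    # the model y = Xβ + ε
```

Each row of `X` is one observation's regressor vector *x*_{i}′, so the matrix product computes all *n* dot products *x*_{i}′*β* at once.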

As a rule, the constant term is always included in the set of regressors *X*, say, by taking *x*_{i1} = 1 for all *i* = 1, …, *n*. The coefficient *β*_{1} corresponding to this regressor is called the *intercept*.
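A common way to include the constant term (a minimal sketch, with made-up regressor values) is to prepend a column of ones to the regressor matrix:

```python
import numpy as np

X_raw = np.array([[1.5, 2.0],
                  [0.3, 4.1],
                  [2.2, 0.7]])   # n×(p−1) regressors without the constant

# Set x_{i1} = 1 for all i; the coefficient β_1 on this column
# is then the intercept.
X = np.column_stack([np.ones(X_raw.shape[0]), X_raw])
```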

There may be some relationship between the regressors. For instance, the third regressor may be the square of the second regressor. In this case (assuming that the first regressor is constant) we have a quadratic model in the second regressor. But this is still considered a linear model because it is linear in the *β*s.
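To see why the quadratic case is still a linear model, one can build a design matrix with columns 1, *x*, *x*² and solve for *β* by ordinary least squares; the sketch below uses `numpy.linalg.lstsq` as a stand-in OLS solver on noise-free data:

```python
import numpy as np

x = np.linspace(-2, 2, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2     # exact quadratic response, no noise

# Third regressor is the square of the second: X = [1, x, x²].
X = np.column_stack([np.ones_like(x), x, x**2])

# The model is linear in the β's, so least squares applies directly
# and recovers the coefficients (1, 2, −3) exactly here.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```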

