Linear Least Squares (mathematics) - Motivational Example

See also: Polynomial regression

As a result of an experiment, four data points were obtained: (1, 6), (2, 5), (3, 7), and (4, 10) (shown in red in the picture on the right). It is desired to find a line that "best" fits these four points. In other words, we would like to find the numbers \beta_1 and \beta_2 that approximately solve the overdetermined linear system

\begin{alignat}{3}
\beta_1 + 1\beta_2 &&\; = \;&& 6 & \\
\beta_1 + 2\beta_2 &&\; = \;&& 5 & \\
\beta_1 + 3\beta_2 &&\; = \;&& 7 & \\
\beta_1 + 4\beta_2 &&\; = \;&& 10 & \\
\end{alignat}

of four equations in two unknowns in some "best" sense.
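Equivalently, the system can be written in matrix form as X\beta = y (this matrix notation is not used in the text above and is added here only for illustration):

\begin{equation*}
\underbrace{\begin{pmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{pmatrix}}_{X}
\begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix}
=
\underbrace{\begin{pmatrix} 6 \\ 5 \\ 7 \\ 10 \end{pmatrix}}_{y}.
\end{equation*}

Because X has four rows but only two columns, the system generally has no exact solution, which is why a "best" approximate solution is sought.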

The least squares approach to solving this problem is to make the sum of squares of the "errors" between the right- and left-hand sides of these equations as small as possible, that is, to find the minimum of the function

\begin{align}S(\beta_1, \beta_2) =& \left^2
+\left^2 \\
&+\left^2
+\left^2.\end{align}
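This objective is straightforward to evaluate numerically; a minimal sketch (the function and variable names here are illustrative, not from the text):

```python
# Data points (x_i, y_i) taken from the four equations above.
points = [(1, 6), (2, 5), (3, 7), (4, 10)]

def S(beta1, beta2):
    """Sum of squared errors for the candidate line y = beta1 + beta2 * x."""
    return sum((y - (beta1 + beta2 * x)) ** 2 for x, y in points)
```

For example, S(0, 0) simply sums the squares of the right-hand sides: 6^2 + 5^2 + 7^2 + 10^2 = 210.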

The minimum is determined by calculating the partial derivatives of S(\beta_1, \beta_2) with respect to \beta_1 and \beta_2 and setting them to zero. This results in a system of two equations in two unknowns, called the normal equations, which give, when solved,

\begin{align}
\beta_1 &= 3.5 \\
\beta_2 &= 1.4
\end{align}

and the equation y = 3.5 + 1.4x of the line of best fit. The residuals, that is, the discrepancies between the values from the experiment and the values calculated using the line of best fit, are then found to be 1.1, -1.3, -0.7, and 0.9 (see the picture on the right). The minimum value of the sum of squares is S(3.5, 1.4) = 4.2.
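The same solution can be checked numerically, for instance with NumPy (a sketch, assuming NumPy is available; the variable names are illustrative):

```python
import numpy as np

# Design matrix: a column of ones (intercept term) and the x-values.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([6.0, 5.0, 7.0, 10.0])

# Solve the normal equations (A^T A) beta = A^T y directly.
beta = np.linalg.solve(A.T @ A, A.T @ y)   # approximately [3.5, 1.4]

# Or use the library's least-squares routine, which also returns
# the minimized sum of squared residuals.
beta_lstsq, sq_resid, rank, _ = np.linalg.lstsq(A, y, rcond=None)
```

Both approaches agree here; `np.linalg.lstsq` is generally preferred in practice because it avoids explicitly forming A^T A, which can be numerically ill-conditioned for larger problems.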
