Orthogonality Principle - A Solution To Error Minimization Problems

The following is one way to find the minimum mean square error estimator by using the orthogonality principle.

We want to approximate a vector $x$ by

\hat{x}=\sum_i c_{i}p_{i},

where $\hat{x}$ is the approximation of $x$ as a linear combination of the vectors $p_{i}$ in the subspace spanned by $\{p_{i}\}$. Therefore, we want to solve for the coefficients $c_{i}$, so that we may write our approximation in known terms.

By the orthogonality theorem, the square norm of the error vector, $\left\Vert x-\sum_i c_{i}p_{i}\right\Vert ^{2}$, is minimized when, for all $j$,

\left\langle x-\sum_i c_{i}p_{i},p_{j}\right\rangle =0.

Developing this equation, we obtain


\left\langle x,p_{j}\right\rangle =\left\langle \sum_i c_{i}p_{i},p_{j}\right\rangle =\sum_i c_{i}\left\langle p_{i},p_{j}\right\rangle.

If the number of vectors $p_{i}$ is finite, one can write this equation in matrix form as


\begin{bmatrix}
\left\langle x,p_{1}\right\rangle \\
\left\langle x,p_{2}\right\rangle \\
\vdots\\
\left\langle x,p_{n}\right\rangle \end{bmatrix}
=
\begin{bmatrix}
\left\langle p_{1},p_{1}\right\rangle & \left\langle p_{2},p_{1}\right\rangle & \cdots & \left\langle p_{n},p_{1}\right\rangle \\
\left\langle p_{1},p_{2}\right\rangle & \left\langle p_{2},p_{2}\right\rangle & \cdots & \left\langle p_{n},p_{2}\right\rangle \\
\vdots & \vdots & \ddots & \vdots\\
\left\langle p_{1},p_{n}\right\rangle & \left\langle p_{2},p_{n}\right\rangle & \cdots & \left\langle p_{n},p_{n}\right\rangle \end{bmatrix}
\begin{bmatrix}
c_{1}\\
c_{2}\\
\vdots\\
c_{n}\end{bmatrix}.

Assuming the $p_{i}$ are linearly independent, the Gramian matrix can be inverted to obtain

\begin{bmatrix}
c_{1}\\
c_{2}\\
\vdots\\
c_{n}\end{bmatrix}
=
\begin{bmatrix}
\left\langle p_{1},p_{1}\right\rangle & \left\langle p_{2},p_{1}\right\rangle & \cdots & \left\langle p_{n},p_{1}\right\rangle \\
\left\langle p_{1},p_{2}\right\rangle & \left\langle p_{2},p_{2}\right\rangle & \cdots & \left\langle p_{n},p_{2}\right\rangle \\
\vdots & \vdots & \ddots & \vdots\\
\left\langle p_{1},p_{n}\right\rangle & \left\langle p_{2},p_{n}\right\rangle & \cdots & \left\langle p_{n},p_{n}\right\rangle \end{bmatrix}^{-1}
\begin{bmatrix}
\left\langle x,p_{1}\right\rangle \\
\left\langle x,p_{2}\right\rangle \\
\vdots\\
\left\langle x,p_{n}\right\rangle \end{bmatrix},

thus providing an expression for the coefficients of the minimum mean square error estimator.
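As a sketch, the closed-form solution above can be checked numerically. Assuming the ordinary Euclidean dot product as the inner product, and using a small made-up basis (the vectors `p1`, `p2` and the target `x` below are hypothetical, chosen only for illustration), the coefficients are obtained by solving the Gram system:

```python
import numpy as np

# Hypothetical data: approximate x in R^4 by a linear combination
# of two non-orthogonal vectors p1 and p2.
p = [np.array([1.0, 1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 1.0, 0.0])]
x = np.array([1.0, 2.0, 3.0, 4.0])

n = len(p)
# Gram matrix G[j, i] = <p_i, p_j> and right-hand side b[j] = <x, p_j>
G = np.array([[np.dot(p[i], p[j]) for i in range(n)] for j in range(n)])
b = np.array([np.dot(x, p[j]) for j in range(n)])

# Solve G c = b (numerically preferable to forming the explicit inverse)
c = np.linalg.solve(G, b)
x_hat = sum(ci * pi for ci, pi in zip(c, p))

# The residual x - x_hat is orthogonal to every p_j, as the
# orthogonality principle requires.
for pj in p:
    assert abs(np.dot(x - x_hat, pj)) < 1e-10
```

In practice one solves the linear system rather than inverting the Gramian, which is cheaper and better conditioned; in the mean-square-error setting the inner product $\left\langle u,v\right\rangle$ would be the expectation $\mathrm{E}$ of the product of the random variables, but the algebra is identical.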
