Least Mean Squares Filter - Convergence and Stability in the Mean


As the LMS algorithm does not use the exact values of the expectations, the weights would never reach the optimal weights in the absolute sense, but convergence in the mean is possible. That is, even though the weights may change by small amounts, they change about the optimal weights. However, if the variance with which the weights change is large, convergence in the mean would be misleading. This problem may occur if the value of the step size μ is not chosen properly.

If μ is chosen to be large, the amount with which the weights change depends heavily on the gradient estimate, and so the weights may change by a large value, so that a gradient which was negative at the first instant may now become positive. At the second instant, the weights may then change in the opposite direction by a large amount because of the negative gradient, and would thus keep oscillating with a large variance about the optimal weights. On the other hand, if μ is chosen to be too small, the time to converge to the optimal weights will be too large.
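This trade-off can be seen in a minimal system-identification sketch. The filter coefficients, input signal, and step sizes below are hypothetical values chosen for illustration: a moderate step size converges close to the unknown system, while a very small one is still far away after the same amount of data.

```python
import numpy as np

rng = np.random.default_rng(0)

def lms(x, d, mu, n_taps=4):
    """LMS update w <- w + mu * e(n) * u(n); returns the final weight vector."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # current input vector, newest sample first
        e = d[n] - w @ u                    # instantaneous error
        w = w + mu * e * u                  # stochastic-gradient step
    return w

# Hypothetical unknown FIR system to identify
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(5000)               # white input
d = np.convolve(x, h)[:len(x)]              # desired signal (noiseless for the demo)

w_mod = lms(x, d, mu=0.01)     # moderate step size: ends up close to h
w_tiny = lms(x, d, mu=0.0005)  # very small step size: converges much more slowly
```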

Thus, an upper bound on μ is needed, which is given as


0<\mu<\frac{2}{\lambda_{\mathrm{max}}},

where λ_max is the greatest eigenvalue of the autocorrelation matrix R. If this condition is not fulfilled, the algorithm becomes unstable and diverges.
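The bound follows directly from the eigenvalues of R. A short sketch, using an assumed AR(1)-style autocorrelation sequence r[k] = aᵏ purely as an example:

```python
import numpy as np

# Autocorrelation sequence of a hypothetical input, r[k] = a^k (assumed for the demo)
a = 0.9
N = 8                                    # filter length (assumed)
r = a ** np.arange(N)

# Build the symmetric Toeplitz autocorrelation matrix R[i, j] = r[|i - j|]
R = np.array([[r[abs(i - j)] for j in range(N)] for i in range(N)])

lam = np.linalg.eigvalsh(R)              # eigenvalues of the symmetric matrix R
mu_max = 2.0 / lam.max()                 # stability-in-the-mean bound on the step size
print(mu_max)
```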

Maximum convergence speed is achieved when


\mu=\frac{2}{\lambda_{\mathrm{max}}+\lambda_{\mathrm{min}}},

where λ_min is the smallest eigenvalue of R. Given that μ is less than or equal to this optimum, the convergence speed is determined by λ_min, with a larger value yielding faster convergence. This means that faster convergence can be achieved when λ_min is close to λ_max; that is, the maximum achievable convergence speed depends on the eigenvalue spread of R.
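A quick numerical illustration, using an assumed set of eigenvalues: in the mean, each eigen-mode of the weight error decays by a factor |1 − μλᵢ| per iteration, and the choice μ = 2/(λ_max + λ_min) balances the decay of the slowest and fastest modes.

```python
import numpy as np

# Eigenvalues of a hypothetical autocorrelation matrix R (assumed values)
lam = np.array([0.2, 0.5, 1.0, 1.8])

mu_opt = 2.0 / (lam.max() + lam.min())   # step size maximizing convergence speed

# Each eigen-mode decays by |1 - mu*lambda_i| per iteration in the mean;
# the slowest (largest) factor sets the overall convergence speed.
decay = np.abs(1.0 - mu_opt * lam)
print(mu_opt, decay.max())
```

Note that at μ_opt the decay factors of the λ_min and λ_max modes coincide; nudging μ in either direction slows one of them down.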

A white noise signal has autocorrelation matrix R = σ²I, where σ² is the variance of the signal. In this case all eigenvalues are equal, and the eigenvalue spread is the minimum over all possible matrices. The common interpretation of this result is therefore that the LMS converges quickly for white input signals, and slowly for colored input signals, such as processes with low-pass or high-pass characteristics.
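This can be checked empirically by estimating eigenvalue spreads from data. The moving-average coloring filter and signal lengths below are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8                                    # filter length (assumed for the demo)

def eig_spread(x, N):
    """Eigenvalue spread lambda_max/lambda_min of the N x N autocorrelation matrix of x."""
    # Biased autocorrelation estimate (guarantees a positive semidefinite R)
    r = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(N)])
    R = np.array([[r[abs(i - j)] for j in range(N)] for i in range(N)])
    lam = np.linalg.eigvalsh(R)
    return lam.max() / lam.min()

white = rng.standard_normal(100_000)
# Low-pass (colored) input: moving-average filtered white noise
colored = np.convolve(white, np.ones(4) / 4.0)[:len(white)]

spread_white = eig_spread(white, N)      # close to 1
spread_colored = eig_spread(colored, N)  # much larger
print(spread_white, spread_colored)
```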

It is important to note that the above upper bound on μ only enforces stability in the mean, but the coefficients of ĥ(n) can still grow infinitely large, i.e., divergence of the coefficients is still possible. A more practical bound is


0<\mu<\frac{2}{\mathrm{tr}\left[\mathbf{R}\right]},

where tr[R] denotes the trace of R. This bound guarantees that the coefficients of ĥ(n) do not diverge (in practice, the value of μ should not be chosen close to this upper bound, since it is somewhat optimistic due to approximations and assumptions made in the derivation of the bound).
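One practical consequence: since tr[R] = N·E[x²(n)] for an N-tap filter, this bound can be estimated from the input power alone, without an eigendecomposition. A sketch with an assumed input variance:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 8                                    # filter length (assumed)
x = rng.standard_normal(50_000) * 2.0    # hypothetical input with variance 4

# tr[R] = N * E[x^2]: only the input power is needed, not R itself
trace_R = N * np.mean(x ** 2)
mu_bound = 2.0 / trace_R

# In practice mu is chosen well below the bound, e.g. a tenth of it
mu = 0.1 * mu_bound
print(mu_bound, mu)
```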

