Arnoldi Iteration - Krylov Subspaces and The Power Iteration

An intuitive method for finding an eigenvalue (specifically the eigenvalue of largest magnitude) of a given $m \times m$ matrix $A$ is the power iteration. Starting with an initial random vector $b$, this method computes $Ab, A^2 b, A^3 b, \ldots$ iteratively, normalizing the result and storing it back in $b$ at every step. This sequence converges to the eigenvector corresponding to the largest eigenvalue, $\lambda_1$. However, much potentially useful computation is wasted by using only the final result, $A^{n-1} b$. This suggests that we instead form the so-called Krylov matrix:

$$K_n = \begin{bmatrix} b & Ab & A^2 b & \cdots & A^{n-1} b \end{bmatrix}.$$
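As a rough illustration of the power iteration described above, here is a minimal sketch in Python; the function name, step count, and random seed are assumptions made for the example, not part of the source.

import numpy as np

# Illustrative sketch; the name power_iteration and its parameters are assumptions.
def power_iteration(A, n_steps=100, seed=0):
    """Approximate the dominant eigenpair of A by repeated multiplication."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(A.shape[0])   # initial random vector b
    b /= np.linalg.norm(b)
    for _ in range(n_steps):
        b = A @ b                          # b becomes (a multiple of) A^k b
        b /= np.linalg.norm(b)             # normalize to avoid overflow/underflow
    eigenvalue = b @ (A @ b)               # Rayleigh quotient gives the eigenvalue estimate
    return eigenvalue, b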
The columns of this matrix are not orthogonal, but in principle we can extract an orthogonal basis from them via a method such as Gram–Schmidt orthogonalization. The resulting vectors form a basis of the Krylov subspace $\mathcal{K}_n$. We may expect the vectors of this basis to give good approximations of the eigenvectors corresponding to the $n$ largest eigenvalues, for the same reason that $A^{n-1} b$ approximates the dominant eigenvector.
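This construction can be sketched as follows: build the Krylov matrix column by column, then orthonormalize its columns. The helper name krylov_basis and the choice of modified (rather than classical) Gram–Schmidt are assumptions made for the example.

import numpy as np

# Illustrative sketch; krylov_basis is an assumed helper name, not from the source.
def krylov_basis(A, b, n):
    """Return Q with orthonormal columns spanning K_n = span{b, Ab, ..., A^(n-1) b}."""
    m = A.shape[0]
    K = np.empty((m, n))
    K[:, 0] = b
    for j in range(1, n):
        K[:, j] = A @ K[:, j - 1]          # columns b, Ab, A^2 b, ...
    # Modified Gram-Schmidt on the columns of K
    Q = np.zeros((m, n))
    for j in range(n):
        v = K[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ v) * Q[:, i]   # remove components along earlier basis vectors
        Q[:, j] = v / np.linalg.norm(v)
    return Q

In floating-point arithmetic the columns of $K_n$ become nearly linearly dependent as $n$ grows, which is one motivation for orthogonalizing each new vector as it is produced, as the Arnoldi iteration does, rather than forming $K_n$ explicitly.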
