Eigenvalue Algorithm - Hessenberg and Tridiagonal Matrices


Because the eigenvalues of a triangular matrix are its diagonal elements, for general matrices there is no finite method like Gaussian elimination to convert a matrix to triangular form while preserving eigenvalues: such a method would compute the roots of an arbitrary characteristic polynomial in finitely many exact steps, which the Abel–Ruffini theorem rules out for degree five and higher. But it is possible to reach something close to triangular. An upper Hessenberg matrix is a square matrix for which all entries below the subdiagonal are zero. A lower Hessenberg matrix is one for which all entries above the superdiagonal are zero. Matrices that are both upper and lower Hessenberg are tridiagonal. Hessenberg and tridiagonal matrices are the starting points for many eigenvalue algorithms because the zero entries reduce the complexity of the problem. Several methods are commonly used to convert a general matrix into a Hessenberg matrix with the same eigenvalues. If the original matrix was symmetric or Hermitian, then the resulting matrix will be tridiagonal.
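
For concreteness, the sketch below (using NumPy and SciPy's scipy.linalg.hessenberg; the random test matrices and printed checks are purely illustrative) verifies that the Hessenberg form of a general matrix has the same eigenvalues, and that the Hessenberg form of a symmetric matrix is tridiagonal:

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

# Reduce a general matrix to upper Hessenberg form by a similarity transform.
H = hessenberg(A)
print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                  np.sort_complex(np.linalg.eigvals(H))))  # eigenvalues agree, up to rounding

# For a symmetric (or Hermitian) matrix the same reduction gives a tridiagonal matrix.
S = A + A.T
T = hessenberg(S)
print(np.allclose(T, np.triu(np.tril(T, 1), -1)))  # tridiagonal, up to rounding
```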


When only eigenvalues are required, there is no need to compute the similarity matrix, as the transformed matrix has the same eigenvalues. If eigenvectors are wanted as well, the similarity matrix is needed to transform the eigenvectors of the Hessenberg matrix back into eigenvectors of the original matrix.
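
A minimal sketch of that back-transformation, again with SciPy's hessenberg; calc_q=True also returns the orthogonal similarity matrix Q with A = Q H Q^T (the random matrix and variable names are purely illustrative):

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))

# calc_q=True also returns the orthogonal similarity matrix Q, with A = Q H Q^T.
H, Q = hessenberg(A, calc_q=True)

# Eigenvalues of H are already those of A; eigenvectors must be mapped back through Q.
w, V = np.linalg.eig(H)        # H @ V[:, i] == w[i] * V[:, i]
V_A = Q @ V                    # columns are eigenvectors of the original matrix A
print(np.allclose(A @ V_A, V_A * w))  # True, up to rounding
```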

Method | Applies to | Produces | Cost without similarity matrix | Cost with similarity matrix | Description
Householder transformations | General | Hessenberg | 2n³/3 + O(n²) | 4n³/3 + O(n²) | Reflect each column through a subspace to zero out its lower entries (sketched in the code below).
Givens rotations | General | Hessenberg | 4n³/3 + O(n²) | | Apply planar rotations to zero out individual entries. Rotations are ordered so that later ones do not cause zero entries to become non-zero again.
Arnoldi iteration | General | Hessenberg | | | Perform Gram–Schmidt orthogonalization on Krylov subspaces.
Lanczos algorithm | Hermitian | Tridiagonal | | | Arnoldi iteration for Hermitian matrices, with shortcuts.
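
As a from-scratch sketch of the first method in the table, the routine below reduces a real matrix to upper Hessenberg form with Householder reflections, without accumulating the similarity matrix; the function name and structure are illustrative rather than a reference implementation:

```python
import numpy as np

def householder_hessenberg(A):
    """Return an upper Hessenberg matrix similar to A (real, square)."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    for k in range(n - 2):
        # Build a reflector that zeros the entries below the subdiagonal in column k.
        x = H[k + 1:, k].copy()
        alpha = np.linalg.norm(x)
        if alpha == 0.0:
            continue                                  # column already reduced
        v = x.copy()
        v[0] += alpha if x[0] >= 0 else -alpha        # sign chosen to avoid cancellation
        v /= np.linalg.norm(v)
        # Similarity transform H <- P H P with P = I - 2 v v^T (padded with the identity).
        H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
    return H

# The reduction preserves eigenvalues:
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
H = householder_hessenberg(A)
print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                  np.sort_complex(np.linalg.eigvals(H))))  # True, up to rounding
```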
