Eigenvalue Algorithm - Eigenvalues and Eigenvectors

Given an n × n square matrix A of real or complex numbers, an eigenvalue λ and its associated generalized eigenvector v are a pair obeying the relation

    (A - λI)^k v = 0,

where v is a nonzero n × 1 column vector, I is the n × n identity matrix, k is a positive integer, and both λ and v are allowed to be complex even when A is real. When k = 1, the vector is called simply an eigenvector, and the pair is called an eigenpair. In this case, Av = λv. Any eigenvalue λ of A has ordinary eigenvectors associated to it, for if k is the smallest integer such that (A - λI)^k v = 0 for a generalized eigenvector v, then (A - λI)^(k-1) v is an ordinary eigenvector. The value k can always be taken as less than or equal to n. In particular, (A - λI)^n v = 0 for all generalized eigenvectors v associated with λ.
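As a concrete illustration (a minimal numpy sketch using a made-up 2 × 2 Jordan block, not part of the original text), the matrix below has a generalized eigenvector of rank k = 2, and applying (A - λI) to it once yields an ordinary eigenvector:

```python
import numpy as np

# 2x2 Jordan block: a defective matrix whose only eigenvalue is 2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
I = np.eye(2)

v = np.array([0.0, 1.0])   # generalized eigenvector with k = 2
w = (A - lam * I) @ v      # (A - λI)v is an ordinary eigenvector

print(w)                   # [1. 0.]  -- ordinary eigenvector: Aw = 2w
print((A - lam * I) @ w)   # [0. 0.]  -- hence (A - λI)^2 v = 0
```

Here k = 2 ≤ n = 2, consistent with the bound stated above.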


For each eigenvalue λ of A, the kernel ker(A - λI) consists of all eigenvectors associated with λ (along with 0), called the eigenspace of λ, while the vector space ker((A - λI)^n) consists of all generalized eigenvectors, and is called the generalized eigenspace. The geometric multiplicity of λ is the dimension of its eigenspace. The algebraic multiplicity of λ is the dimension of its generalized eigenspace. The latter terminology is justified by the equation

    pA(z) = det(zI - A) = (z - λ1)^α1 (z - λ2)^α2 ··· (z - λm)^αm,

where det is the determinant function, the λi are all the distinct eigenvalues of A and the αi are the corresponding algebraic multiplicities. The function pA(z) is the characteristic polynomial of A. So the algebraic multiplicity is the multiplicity of the eigenvalue as a zero of the characteristic polynomial. Since any eigenvector is also a generalized eigenvector, the geometric multiplicity is less than or equal to the algebraic multiplicity. The algebraic multiplicities sum up to n, the degree of the characteristic polynomial. The equation pA(z) = 0 is called the characteristic equation, as its roots are exactly the eigenvalues of A. By the Cayley-Hamilton theorem, A itself obeys the same equation: pA(A) = 0. As a consequence, the columns of the matrix ∏_{i ≠ j} (A - λiI)^αi must be either 0 or generalized eigenvectors of the eigenvalue λj, since they are annihilated by (A - λjI)^αj. In fact, the column space is the generalized eigenspace of λj.
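These notions can be checked symbolically (a small sympy sketch with a made-up defective 3 × 3 matrix; not from the original text): the characteristic polynomial exposes the algebraic multiplicities, the nullspace dimension gives the geometric multiplicity, and Cayley-Hamilton makes the factored polynomial annihilate A:

```python
import sympy as sp

# Defective example: eigenvalue 2 has algebraic multiplicity 2
# but geometric multiplicity 1; eigenvalue 5 is simple.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 5]])
z = sp.symbols('z')

p = A.charpoly(z)                 # characteristic polynomial pA(z)
print(sp.factor(p.as_expr()))     # factors as (z - 2)^2 (z - 5)

# Geometric multiplicity of λ = 2: dim ker(A - 2I).
geo = len((A - 2 * sp.eye(3)).nullspace())
print(geo)                        # 1, strictly less than α = 2

# Cayley-Hamilton: substituting A into pA gives the zero matrix.
print((A - 2 * sp.eye(3))**2 * (A - 5 * sp.eye(3)))
```

The last product is exactly pA(A), and it comes out as the 3 × 3 zero matrix.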


Any collection of generalized eigenvectors of distinct eigenvalues is linearly independent, so a basis for all of C^n can be chosen consisting of generalized eigenvectors. More particularly, this basis {v1, v2, ..., vn} can be chosen and organized so that

  • if vi and vj have the same eigenvalue, then so does vk for each k between i and j, and
  • if vi is not an ordinary eigenvector, and if λi is its eigenvalue, then (A - λiI)vi = vi-1 (in particular, v1 must be an ordinary eigenvector).

If these basis vectors are placed as the column vectors of a matrix V = [v1 v2 ... vn], then V can be used to convert A to its Jordan normal form:

    V^-1 A V = [ λ1 β1 0  ...  0
                 0  λ2 β2 ...  0
                 ...
                 0  ...  0  λn-1 βn-1
                 0  ...  0  0   λn ],

where the λi are the eigenvalues, βi = 1 if (A - λi+1 I)vi+1 = vi and βi = 0 otherwise.
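The decomposition above can be computed with sympy's `jordan_form` (a sketch reusing the hypothetical defective matrix from earlier; the block of 1s on the superdiagonal corresponds to the βi):

```python
import sympy as sp

# Defective matrix: eigenvalue 2 appears twice with only one
# independent eigenvector, forcing a 2x2 Jordan block.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 5]])

V, J = A.jordan_form()   # A = V * J * V^-1, J in Jordan normal form
print(J)                 # upper bidiagonal: λi on the diagonal,
                         # βi ∈ {0, 1} on the superdiagonal

# V's columns form the organized basis of generalized eigenvectors.
print(sp.simplify(V * J * V.inv() - A))   # zero matrix
```

The columns of V are exactly a basis organized as in the two bullet points above.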

More generally, if W is any invertible matrix, and λ is an eigenvalue of A with generalized eigenvector v, then (W^-1AW - λI)^k W^-1 v = W^-1 (A - λI)^k v = 0. Thus λ is an eigenvalue of W^-1AW with generalized eigenvector W^-1 v. That is, similar matrices have the same eigenvalues.
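This invariance under similarity is easy to verify numerically (a minimal numpy sketch; A and W are arbitrary made-up matrices, with W invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
# A generic invertible change-of-basis matrix.
W = rng.standard_normal((2, 2)) + 2 * np.eye(2)

# Similar matrices W^-1 A W share the eigenvalues of A.
eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(np.linalg.inv(W) @ A @ W))
print(np.allclose(eig_A, eig_B))   # True
```

This is the basis of many eigenvalue algorithms, which apply similarity transformations to drive A toward a form whose eigenvalues can be read off.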
