Eigenvalues and Eigenvectors
An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original at most by a multiplicative scalar.
For example, if three-element vectors are seen as arrows in three-dimensional space, an eigenvector of a 3×3 matrix A is an arrow whose direction is either preserved or exactly reversed after multiplication by A. The corresponding eigenvalue determines how the length and sense of the arrow are changed by the operation.
Specifically, a non-zero column vector v is a right eigenvector of a matrix A if (and only if) there exists a number λ such that Av = λv. If instead a non-zero row vector u satisfies uA = λu, it is said to be a left eigenvector. The number λ is called the eigenvalue corresponding to that vector. The set of all eigenvectors of a matrix, each paired with its corresponding eigenvalue, is called the eigensystem of that matrix.
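The defining relation Av = λv can be checked numerically. The sketch below uses a hand-picked symmetric 2×2 matrix (the values are illustrative, not from the text), whose eigenvectors can be verified by direct multiplication:

```python
# Minimal numeric check of the defining relation Av = λv for a
# hand-picked 2x2 matrix (illustrative values, not from the text).

def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a column vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1],
     [1, 2]]

# v = (1, 1) is a right eigenvector of A with eigenvalue 3:
v = [1, 1]
assert matvec(A, v) == [3 * x for x in v]

# w = (1, -1) is a right eigenvector of A with eigenvalue 1:
w = [1, -1]
assert matvec(A, w) == [1 * x for x in w]
```

Multiplying A by each vector simply rescales it, which is exactly the eigenvector condition.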
An eigenspace of A is the set of all eigenvectors with the same eigenvalue, together with the zero vector.
The terms characteristic vector, characteristic value, and characteristic space are also used for these concepts. The prefix eigen- is adopted from the German word eigen for "self". A real matrix may fail to have any real eigenvalue (for example, a rotation of the plane by 90° maps no non-zero vector to a scalar multiple of itself), but every complex square matrix has at least one eigenvalue.
These concepts are naturally extended to more general situations, where the set of real scale factors is replaced by any field of scalars (such as the algebraic or complex numbers); the set of Cartesian vectors is replaced by any vector space (such as the continuous functions, the polynomials, or the trigonometric series); and matrix multiplication is replaced by any linear operator that maps vectors to vectors (such as the derivative from calculus). In such cases, the concept of "parallel to" is interpreted as "scalar multiple of", and the "vector" in "eigenvector" may be replaced by a more specific term, as in "eigenfunction", "eigenmode", "eigenface", "eigenstate", and "eigenfrequency". Thus, for example, the exponential function x ↦ e^(λx) is an eigenfunction of the derivative operator d/dx, with eigenvalue λ, since its derivative is λe^(λx).
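The eigenfunction claim for the derivative operator can be illustrated numerically: the finite-difference derivative of f(x) = e^(λx) is approximately λ·f(x) at every sample point. The choice λ = 3 and the sample points below are arbitrary:

```python
# Numeric illustration that f(x) = e^(lam*x) behaves as an eigenfunction
# of d/dx: its derivative (here approximated by a central difference)
# equals lam * f(x). The value lam = 3 and the sample points are arbitrary.
import math

lam = 3.0

def f(x):
    return math.exp(lam * x)

h = 1e-6
for x in [0.0, 0.5, 1.0]:
    deriv = (f(x + h) - f(x - h)) / (2 * h)   # central-difference derivative
    # relative error tolerance accounts for the finite-difference approximation
    assert abs(deriv - lam * f(x)) < 1e-4 * f(x)
```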
Eigenvalues and eigenvectors have many applications in both pure and applied mathematics. They are used in matrix factorization, in quantum mechanics, and in many other areas.
If T is a linear transformation and v is a non-zero vector with T(v) = λv, then v is called a characteristic vector or eigenvector of T, and λ is called a characteristic value or eigenvalue of T. To find an eigenvector or an eigenvalue, one notes that T(v) = λv is equivalent to (T − λ·Id)(v) = 0, where Id is the identity map.
The Courant minimax principle gives a characterization of the eigenvalues of a real symmetric matrix A. With the quadratic form q(x) = xᵀAx, the largest eigenvalue is given by λ₁ = max { q(x) : ‖x‖ = 1 }, and a vector x attaining this maximum is an eigenvector for the corresponding eigenvalue λ₁.
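The characterization of the largest eigenvalue as the maximum of the quadratic form over unit vectors can be checked by brute force. The sketch below samples unit vectors (cos t, sin t) around the circle for an illustrative symmetric 2×2 matrix whose largest eigenvalue is 3:

```python
# Brute-force check (illustrative) that the largest eigenvalue of a real
# symmetric matrix equals the maximum of q(x) = x^T A x over unit vectors.
import math

A = [[2, 1],
     [1, 2]]          # symmetric; its eigenvalues are 3 and 1

def q(x):
    """Quadratic form x^T A x."""
    Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    return sum(xi * yi for xi, yi in zip(x, Ax))

# Sample unit vectors (cos t, sin t) at 1000 points around the circle.
best = max(q((math.cos(2 * math.pi * k / 1000),
              math.sin(2 * math.pi * k / 1000)))
           for k in range(1000))

assert abs(best - 3.0) < 1e-4   # maximum of q on the unit circle ≈ λ₁ = 3
```

The maximum is attained at x = (1/√2, 1/√2), which is exactly the eigenvector for λ₁ = 3.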
A number λ and a non-zero vector v satisfying Av = λv are called an eigenvalue and an eigenvector of A, respectively. The number λ is an eigenvalue of an n×n matrix A if and only if A − λIₙ is not invertible, which is equivalent to det(A − λIₙ) = 0. The polynomial p_A(X) = det(A − XIₙ) in an indeterminate X is called the characteristic polynomial of A; its roots are exactly the eigenvalues of the matrix.
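For a 2×2 matrix the determinant det(A − λI) can be expanded by hand, and the eigenvalues found as the values of λ where it vanishes. A small sketch, again with illustrative values:

```python
# Sketch: eigenvalues as roots of the characteristic polynomial
# p_A(lam) = det(A - lam*I), expanded by hand for a 2x2 matrix
# (values chosen for illustration).
A = [[2, 1],
     [1, 2]]

def char_poly(A, lam):
    """det(A - lam*I) for a 2x2 matrix A."""
    a, b = A[0][0] - lam, A[0][1]
    c, d = A[1][0], A[1][1] - lam
    return a * d - b * c

# p_A vanishes exactly at the eigenvalues 1 and 3:
assert char_poly(A, 1) == 0
assert char_poly(A, 3) == 0
# ... and is non-zero away from them, e.g. at lam = 2:
assert char_poly(A, 2) != 0
```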
Eigenvectors corresponding to distinct eigenvalues of a normal matrix are orthogonal. Moreover, for any normal matrix A, Cⁿ has an orthonormal basis consisting of eigenvectors of A, and the corresponding matrix of eigenvectors is unitary.
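For a real symmetric matrix (a special case of a normal matrix) this can be verified numerically: the matrix Q whose columns are the eigenvectors satisfies QᵀQ = I. A sketch using NumPy's `numpy.linalg.eigh` routine for symmetric matrices (the matrix values are illustrative):

```python
# Numeric check that a real symmetric (hence normal) matrix has an
# orthonormal basis of eigenvectors: the eigenvector matrix Q returned
# by eigh satisfies Q^T Q = I, and A Q = Q diag(eigenvalues).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric, therefore normal

eigvals, Q = np.linalg.eigh(A)      # columns of Q are eigenvectors of A

assert np.allclose(Q.T @ Q, np.eye(2))    # eigenvectors are orthonormal
assert np.allclose(A @ Q, Q * eigvals)    # each column satisfies Av = λv
```

For a real symmetric matrix the eigenvector matrix is orthogonal; in the complex normal case the same statement holds with "orthogonal" replaced by "unitary".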