For a square matrix $A$, eigenvalues are scalars $\lambda$ that satisfy the equation
$$A\mathbf{v} = \lambda\mathbf{v}$$
for some nonzero vector $\mathbf{v}$, and those nonzero vectors $\mathbf{v}$ are the eigenvectors.
Geometrically, a vector $\mathbf{v}$ is an eigenvector of the linear transformation $A$ with eigenvalue $\lambda$ if $A$ stretches $\mathbf{v}$ by a factor of $\lambda$.
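As a quick numerical sketch of the defining equation (using NumPy and an arbitrary illustrative matrix), each eigenpair returned by `np.linalg.eig` satisfies $A\mathbf{v} = \lambda\mathbf{v}$:

```python
import numpy as np

# An illustrative 2x2 matrix (diagonal, so the eigenvalues are easy to read off).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose *columns* are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each eigenpair (lam, v), A @ v equals lam * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Here the eigenvalues are simply the diagonal entries $2$ and $3$, with the standard basis vectors as eigenvectors.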
The prefix eigen- comes from the German word “eigen,” meaning ‘proper,’ ‘characteristic,’ or ‘own.’1 Consequently, eigenvectors are also referred to as characteristic vectors, while eigenvalues can be called characteristic values or characteristic roots.
Solutions to Eigenvalues
To find the eigenvalues, we can rewrite the equation in the form
$$(A - \lambda I)\mathbf{v} = \mathbf{0}.$$
Since $\mathbf{v}$ is nonzero, it follows that the matrix $A - \lambda I$ must be singular, and its determinant must be 0:
$$\det(A - \lambda I) = 0.$$
In other words, a number $\lambda$ will be an eigenvalue of a square matrix $A$ if and only if $\lambda$ is a root of the characteristic polynomial
$$p(\lambda) = \det(A - \lambda I).$$
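A small numerical check of this equivalence (using NumPy on an arbitrary example matrix): `np.poly` gives the coefficients of the characteristic polynomial of a matrix, and its roots match the eigenvalues computed directly.

```python
import numpy as np

# Illustrative matrix with trace 7 and determinant 10,
# so the characteristic polynomial is lambda^2 - 7*lambda + 10 = (lambda - 2)(lambda - 5).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

coeffs = np.poly(A)        # coefficients of det(lambda*I - A): [1, -7, 10]
roots = np.roots(coeffs)   # roots of the characteristic polynomial

# The roots of the characteristic polynomial are exactly the eigenvalues.
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))
```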
Proof: For $A$ and $B$ to be similar, there must be an invertible matrix $P$ such that $B = P^{-1}AP$. Then
$$\det(B - \lambda I) = \det(P^{-1}AP - \lambda P^{-1}P) = \det\big(P^{-1}(A - \lambda I)P\big) = \det(P^{-1})\det(A - \lambda I)\det(P) = \det(A - \lambda I).$$
Since the characteristic polynomials for similar matrices are the same, the eigenvalues must also be the same.
Corollary: The eigenvalues of similar matrices are equal.
However, having equal eigenvalues does not force matrices to be similar. For example, $\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}$ and the identity matrix $I$ both have the single eigenvalue $1$ counted twice, but they are not similar, since $P^{-1}IP = I$ for every invertible $P$.
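The corollary is easy to verify numerically. A minimal sketch (using NumPy, with an arbitrary matrix $A$ and an arbitrary invertible $P$): the similar matrix $B = P^{-1}AP$ has the same eigenvalues as $A$.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # eigenvalues 2 and 5

# Any invertible P yields a matrix similar to A.
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

# Similar matrices share the same eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```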
Eigenvalues and Determinant
Theorem
Let $\lambda_1, \dots, \lambda_n$ be the eigenvalues, counted with multiplicity, of an $n \times n$ matrix $A$. Then
$$\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n.$$
We need to “count with multiplicity” because a polynomial can have a repeated root that must be counted more than once (e.g. $(\lambda - 1)^2$ has the single root $\lambda = 1$, which we want to count twice).
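A quick numerical check of the theorem (using NumPy on an arbitrary symmetric example matrix): the product of the eigenvalues equals the determinant.

```python
import numpy as np

# Illustrative 3x3 symmetric matrix (symmetric, so all eigenvalues are real).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)

# det(A) equals the product of the eigenvalues, counted with multiplicity.
assert np.isclose(np.prod(eigenvalues), np.linalg.det(A))
```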
If $\lambda$ is an eigenvalue of an invertible matrix $A$, then $1/\lambda$ must be an eigenvalue of $A^{-1}$: multiplying $A\mathbf{v} = \lambda\mathbf{v}$ by $A^{-1}$ on both sides gives $A^{-1}\mathbf{v} = \frac{1}{\lambda}\mathbf{v}$ (and $\lambda \neq 0$, since $A$ is invertible).
There is a one-to-one mapping between $A$'s eigenvalues and $A^{-1}$'s eigenvalues.
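This reciprocal relationship can be sketched numerically (using NumPy, reusing an arbitrary invertible example matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # invertible; eigenvalues 2 and 5

inv_eigs = np.linalg.eigvals(np.linalg.inv(A))

# The eigenvalues of A^{-1} are the reciprocals 1/2 and 1/5.
assert np.allclose(np.sort(inv_eigs),
                   np.sort(1.0 / np.linalg.eigvals(A)))
```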
An important application of this is the spectral radius $\rho(A)$, the maximum of the absolute values of the eigenvalues of a matrix. The spectral radius is useful for checking whether an iterative algorithm converges.
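As a minimal sketch of that convergence criterion (using NumPy, with a hypothetical `spectral_radius` helper and an arbitrary iteration matrix): when $\rho(A) < 1$, the iteration $\mathbf{x}_{k+1} = A\mathbf{x}_k$ drives every starting vector toward zero.

```python
import numpy as np

def spectral_radius(A):
    """Maximum absolute value of the eigenvalues of A."""
    return float(np.max(np.abs(np.linalg.eigvals(A))))

# Illustrative iteration matrix with eigenvalues 0.6 and 0.3, so rho(A) = 0.6 < 1.
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
assert spectral_radius(A) < 1

# Repeatedly applying A shrinks any starting vector toward zero.
x = np.array([1.0, 1.0])
for _ in range(200):
    x = A @ x
assert np.allclose(x, 0.0)
```

If instead $\rho(A) > 1$, some component of the iterate grows without bound, which is why the spectral radius, rather than any single eigenvalue, governs convergence.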