We have seen that it is not always possible to find a basis of eigenvectors of a linear map from a finite-dimensional vector space to itself. We recall that a square matrix has diagonal form, or is a diagonal matrix, if all its $(i,j)$-elements with $i \neq j$ are zero. We now discuss how to determine whether a linear map or matrix admits such a diagonal form.
A linear map $L : V \to V$, where $V$ is a finite-dimensional vector space, is called *diagonalizable* if $V$ has a basis $\alpha$ such that the matrix $L_\alpha$ of $L$ with respect to $\alpha$ is a diagonal matrix.
The matrix $A$ is called diagonalizable if the linear map $L_A : x \mapsto A x$ determined by $A$ is diagonalizable. If more precision is needed and $A$ is real (respectively, complex), we say that $A$ is diagonalizable over the real (respectively, complex) numbers.
The $n \times n$ matrix $A$ is diagonalizable if and only if it is conjugate to a diagonal matrix. This means that there is an invertible $n \times n$ matrix $T$ such that $T^{-1} A T$ is diagonal. Thanks to the theorem Basis transitions in terms of matrices, this is equivalent to the fact that $L_A$ has a diagonal matrix with respect to a suitable basis for $\mathbb{R}^n$ (or $\mathbb{C}^n$ if $A$ is complex).
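As a small illustration (with matrices chosen here only for the sake of example), the matrix $A$ below is conjugated to a diagonal matrix by $T$:
$$A=\begin{bmatrix}2&1\\0&3\end{bmatrix},\qquad T=\begin{bmatrix}1&1\\0&1\end{bmatrix},\qquad T^{-1}AT=\begin{bmatrix}2&0\\0&3\end{bmatrix}.$$
The columns of $T$ are eigenvectors of $A$ with eigenvalues $2$ and $3$, a point we return to below.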
The matrix $A=\begin{bmatrix}1&1\\0&1\end{bmatrix}$ is not diagonalizable. For, otherwise there would be numbers $a$ and $b$ such that $A$ is conjugate to $D=\begin{bmatrix}a&0\\0&b\end{bmatrix}$. But then, since conjugate matrices have the same trace and the same determinant, we would have
$$a+b=2\qquad\text{and}\qquad a\cdot b=1.$$
Substituting $b=2-a$, an equality obtained from the first equation, into the second equation, we obtain the quadratic equation $a^2-2a+1=0$, which has the single solution $a=1$. This implies $b=1$, so $D=I_2$, the $2\times2$ identity matrix. Therefore, there would be an invertible $2\times2$ matrix $T$ with $A=T\,I_2\,T^{-1}=I_2$. This contradicts the fact that the $(1,2)$-element of $A$ is equal to $1$.
It may happen that a square matrix $A$ with real elements is not diagonalizable if we view the matrix as a linear map on a real vector space, but is diagonalizable if we view it as a linear map on a complex vector space (a complexification of the real one).
A well-known example is the matrix $\begin{bmatrix}0&-1\\1&0\end{bmatrix}$ with complex eigenvalues $\mathrm{i}$ and $-\mathrm{i}$.
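Explicitly, over the complex numbers this matrix can be diagonalized as follows (the eigenvectors are chosen here for illustration):
$$A=\begin{bmatrix}0&-1\\1&0\end{bmatrix},\qquad T=\begin{bmatrix}1&1\\-\mathrm{i}&\mathrm{i}\end{bmatrix},\qquad T^{-1}AT=\begin{bmatrix}\mathrm{i}&0\\0&-\mathrm{i}\end{bmatrix},$$
where the columns of $T$ are eigenvectors for the eigenvalues $\mathrm{i}$ and $-\mathrm{i}$, respectively. Over the real numbers no such $T$ exists, because the eigenvalues are not real.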
This is why we speak of diagonalizability over the real numbers and diagonalizability over the complex numbers.
The following statement is easy to understand but very important:
Let $V$ be a vector space of finite dimension $n$ with basis $\alpha$. The following statements regarding a linear map $L : V \to V$ are equivalent.
1. $L$ is diagonalizable.
2. The matrix $L_\alpha$ is diagonalizable.
3. The sum of the dimensions of the eigenspaces of $L$ over all eigenvalues is equal to $n$.
4. There is an invertible $n \times n$ matrix $T$ such that $T^{-1} L_\alpha T$ is a diagonal matrix.
In this case, the columns of the matrix $T$ form a basis of $\mathbb{R}^n$ or $\mathbb{C}^n$ (according to $V$ being real or complex) consisting of eigenvectors of $L_\alpha$.
This repeats what we discussed previously: it follows from the theorem Basis transitions in terms of matrices that there is a basis $\beta$ for $V$ with respect to which the matrix of $L$ is diagonal, if and only if there is an invertible $n \times n$ matrix $T$ such that $T^{-1} L_\alpha T$ is diagonal. In statement 4 we take for $T$ the matrix whose columns are the $\alpha$-coordinates of the vectors of $\beta$.
We saw that the matrix $A=\begin{bmatrix}1&1\\0&1\end{bmatrix}$ is not diagonalizable. This means that $\mathbb{R}^2$ has no basis of eigenvectors of $A$ (in fact, not even $\mathbb{C}^2$ has such a basis). The vector $e_1=\begin{bmatrix}1\\0\end{bmatrix}$ is an eigenvector with eigenvalue $1$. Any other eigenvector of $A$ lies in the span of $e_1$. This matrix is not even diagonalizable over the complex numbers.
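This can also be checked mechanically. The sketch below (Python with sympy, assuming the matrix $\begin{bmatrix}1&1\\0&1\end{bmatrix}$ discussed above) confirms that the sum of the dimensions of the eigenspaces is $1 < 2$, so statement 3 of the theorem fails:

```python
from sympy import Matrix

# Assuming the matrix from the example above.
A = Matrix([[1, 1],
            [0, 1]])

# eigenvects() returns triples (eigenvalue, algebraic multiplicity, eigenvectors).
eigenspace_dim = sum(len(vectors) for _, _, vectors in A.eigenvects())
print(eigenspace_dim)            # 1, which is less than n = 2

print(A.is_diagonalizable())     # False, also over the complex numbers
```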
If there is an invertible $n \times n$ matrix $T$ such that $T^{-1} A T$ is a diagonal matrix, then $T$ may be found as a matrix whose columns form a basis of eigenvectors of $A$. The determination of this basis can be carried out by means of a previously described procedure.
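In practice this construction can also be carried out numerically; a minimal sketch (using numpy, with a matrix chosen here only for illustration):

```python
import numpy as np

# Matrix chosen only for illustration.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, T = np.linalg.eig(A)

# Conjugating A by T yields (up to rounding) a diagonal matrix
# with the eigenvalues on the diagonal.
D = np.linalg.inv(T) @ A @ T
print(np.round(D, 10))
print(eigenvalues)
```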
A direct consequence of the theorem Recognizing diagonalizability is that, for a diagonalizable linear map $L : V \to V$, we are able to find a basis with respect to which the matrix of the map has diagonal form by starting with an arbitrary basis $\alpha$ and finding a matrix $T$ conjugating $L_\alpha$ to a diagonal matrix, that is, such that $T^{-1} L_\alpha T$ is a diagonal matrix:
Let $V$ be a finite-dimensional vector space with basis $\alpha$ and let $L : V \to V$ be a linear mapping.
If $L$ is diagonalizable, then we can find an invertible matrix $T$ whose columns are eigenvectors of $L_\alpha$, such that $T^{-1} L_\alpha T$ is a diagonal matrix. Then the composition $T^{-1}\circ\alpha$ (apply the coordinatization $\alpha$ first and then multiply by $T^{-1}$) is a coordinatization $\beta$ of $V$ such that $L_\beta = T^{-1} L_\alpha T$ is in diagonal form.
If $L_\alpha$ is already a diagonal matrix, then $T = I_n$, where $n = \dim V$, suffices.
Suppose that $L$ is diagonalizable. According to the above argument, the matrix $L_\alpha$ is diagonalizable. This means that there is an invertible matrix $T$ such that $D = T^{-1} L_\alpha T$ is a diagonal matrix. After multiplication of the two sides on the left by $T$, we find $L_\alpha T = T D$, so that $L_\alpha (T e_j) = T D e_j = d_{jj}\,(T e_j)$ for each $j$, where $e_1, \ldots, e_n$ is the standard basis of $\mathbb{R}^n$ or $\mathbb{C}^n$ (if $V$ is real or complex, respectively). On the left is the image of the vector $T e_j$ under $L_\alpha$ and on the right the scalar multiple of that vector with scalar $d_{jj}$, the $j$-th diagonal element of $D$. This shows that the $j$-th column $T e_j$ of $T$ is an eigenvector of $L_\alpha$ with corresponding eigenvalue $d_{jj}$.
Because $T^{-1}\circ\alpha$ is again a coordinatization of $V$, we conclude, thanks to the theorem Basis transition, that
$$L_{T^{-1}\circ\alpha} = T^{-1}\, L_\alpha\, T = D$$
is in diagonal form, as required.
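For a concrete check, take the illustration used earlier ($A = \begin{bmatrix}2&1\\0&3\end{bmatrix}$ in the role of $L_\alpha$, with $T=\begin{bmatrix}1&1\\0&1\end{bmatrix}$ and $D = T^{-1}AT$); the case $j=1$ of the computation above reads
$$L_\alpha(Te_1)=\begin{bmatrix}2&1\\0&3\end{bmatrix}\begin{bmatrix}1\\0\end{bmatrix}=\begin{bmatrix}2\\0\end{bmatrix}=2\,(Te_1)=d_{11}\,(Te_1),$$
so the first column of $T$ is indeed an eigenvector with eigenvalue $d_{11}=2$.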
If there turns out to be a basis of eigenvectors, setting up the matrix of the map with respect to such a basis is simple: the matrix is a diagonal matrix along whose diagonal the eigenvalues appear (in exactly the same order in which the corresponding eigenvectors appear in the basis). Thus, no explicit basis transformation is needed for this calculation.
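For instance (with numbers chosen here only for illustration), if $(v_1, v_2)$ is a basis of eigenvectors with $L(v_1) = 5\,v_1$ and $L(v_2) = -2\,v_2$, then
$$L_{(v_1, v_2)} = \begin{bmatrix} 5 & 0 \\ 0 & -2 \end{bmatrix}, \qquad L_{(v_2, v_1)} = \begin{bmatrix} -2 & 0 \\ 0 & 5 \end{bmatrix}.$$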
The method can also be used to determine whether the linear map is diagonalizable.
For what value of the parameter is the $2 \times 2$ matrix below not diagonalizable over the complex numbers?
The matrix is not equal to a scalar multiple of the identity. Therefore it is diagonalizable (over the complex numbers) if and only if it has two different eigenvalues (possibly complex). This is the case if and only if the characteristic polynomial has two different (possibly complex) roots.

The characteristic polynomial is a quadratic polynomial whose coefficients depend on the parameter, and its discriminant is a linear expression in the parameter. Now the characteristic polynomial has exactly one root (which must be real) if and only if the discriminant is equal to $0$. Solving this linear equation in the parameter gives the required value.
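The computation can be organized as in the following sketch (Python with sympy). The matrix used here is a hypothetical stand-in, since the matrix of the exercise is not reproduced above; the same steps apply to the actual matrix:

```python
from sympy import Matrix, symbols, discriminant, solve

a, x = symbols('a x')

# Hypothetical 2x2 matrix with a parameter a, standing in for the matrix
# of the exercise; it is not a scalar multiple of the identity for any a.
A = Matrix([[1, a],
            [1, 3]])

p = A.charpoly(x).as_expr()   # characteristic polynomial, quadratic in x
d = discriminant(p, x)        # its discriminant, linear in a
print(solve(d, a))            # value(s) of a giving a repeated eigenvalue
```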