
Distinct Eigenvalues

Theorem for distinct eigenvalues. Let $A$ be an $n\times n$ matrix. If $\lambda_1,\ldots,\lambda_k$ are distinct eigenvalues of $A$, then the corresponding eigenvectors $\mathbf{v}_1, \ldots, \mathbf{v}_k$ are linearly independent.

Proof by contradiction. Assume that $\mathbf{v}_1, \ldots, \mathbf{v}_k$ are linearly dependent; we will derive a contradiction. Since an eigenvector is nonzero, $\mathbf{v}_1$ by itself is linearly independent, so we can find the largest integer $p<k$ such that $\mathbf{v}_1,\ldots,\mathbf{v}_p$ are linearly independent but $\mathbf{v}_1,\ldots,\mathbf{v}_p,\mathbf{v}_{p+1}$ are not. Thus we can find a nontrivial solution $x_1,\ldots,x_p,x_{p+1}$ to the homogeneous equation

$\displaystyle x_1 \mathbf{v}_{1} + \cdots + x_p \mathbf{v}_{p} + x_{p+1} \mathbf{v}_{p+1} = \mathbf{0}.$

Proof, continued. Applying $(A - \lambda_{p+1}I)$ to the above homogeneous equation and using $A\mathbf{v}_i = \lambda_i \mathbf{v}_i$, we obtain

\begin{displaymath}
(\lambda_1 - \lambda_{p+1}) x_1 \mathbf{v}_{1} + \cdots + (\lambda_p - \lambda_{p+1}) x_p \mathbf{v}_{p} = \mathbf{0},
\end{displaymath}

since the term with $\mathbf{v}_{p+1}$ has the coefficient $\lambda_{p+1} - \lambda_{p+1} = 0$. Not all of $x_1,\ldots,x_p$ can be zero: otherwise the original equation would reduce to $x_{p+1} \mathbf{v}_{p+1} = \mathbf{0}$ with $x_{p+1} \neq 0$, forcing $\mathbf{v}_{p+1} = \mathbf{0}$, which is impossible for an eigenvector. Since the eigenvalues are distinct, each coefficient $\lambda_i - \lambda_{p+1}$ is nonzero for $i = 1,\ldots,p$, so the equation above is a nontrivial dependence relation among $\mathbf{v}_1,\ldots,\mathbf{v}_p$. This contradicts their linear independence. Hence, $\mathbf{v}_1, \ldots, \mathbf{v}_k$ must be linearly independent. $\square$
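The theorem can be illustrated numerically. The sketch below is not part of the original notes: it assumes NumPy, and the $2\times 2$ matrix is simply a sample with distinct eigenvalues.

import numpy as np

# A sample matrix with two distinct eigenvalues, 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigenvalues, V = np.linalg.eig(A)   # the columns of V are eigenvectors of A
print(eigenvalues)                  # the distinct eigenvalues 2 and 3
print(np.linalg.matrix_rank(V))     # 2, so the two eigenvectors are linearly independent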

Application for diagonalization. As a corollary to the previous theorem we can show the following: Suppose that $A$ has $n$ distinct eigenvalues $\lambda_1,\ldots,\lambda_n$ with corresponding eigenvectors $\mathbf{v}_1,\ldots,\mathbf{v}_n$. Then $P = [\mathbf{v}_1 \cdots \mathbf{v}_n]$ is invertible, and therefore $A$ is diagonalizable: $P^{-1} A P$ is the diagonal matrix with $\lambda_1,\ldots,\lambda_n$ on its diagonal.
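This corollary can also be checked numerically. The following sketch is not part of the original notes; it assumes NumPy, and the $3\times 3$ matrix is an arbitrary example with distinct eigenvalues.

import numpy as np

# An upper triangular matrix with distinct eigenvalues 1, 3, and 5.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigenvalues, P = np.linalg.eig(A)   # P = [v_1 v_2 v_3] holds the eigenvectors as columns
D = np.diag(eigenvalues)            # diagonal matrix of the eigenvalues
print(np.linalg.matrix_rank(P))     # 3, so P is invertible
print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True: P^{-1} A P is diagonal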

EXAMPLE 5. Determine whether the matrix $ A = \begin{bmatrix}
5 &-8 & 1 \\
0 & 0 & 7 \\
0 & 0 &-2
\end{bmatrix}$ is diagonalizable or not. Is $ A$ invertible?
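Example 5 can be checked numerically as well. This NumPy sketch is not part of the original notes; the key fact is that the eigenvalues of a triangular matrix are its diagonal entries.

import numpy as np

A = np.array([[5.0, -8.0,  1.0],
              [0.0,  0.0,  7.0],
              [0.0,  0.0, -2.0]])

print(np.linalg.eigvals(A))                # the diagonal entries 5, 0, -2: three distinct
                                           # eigenvalues, so A is diagonalizable
print(np.isclose(np.linalg.det(A), 0.0))   # True: 0 is an eigenvalue, so A is not invertible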

EXERCISE 6. Diagonalize each of the following matrices, if possible.

  1. $ A = \begin{bmatrix}
5 & 1 \\
0 & 5
\end{bmatrix}$
  2. $ A = \begin{bmatrix}
4 & 2 & 2 \\
2 & 4 & 2 \\
2 & 2 & 4
\end{bmatrix}$
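Answers to Exercise 6 can be verified with an exact computation. The sketch below is not part of the original notes; it assumes SymPy, and the same check applies to either matrix.

from sympy import Matrix

A = Matrix([[5, 1],
            [0, 5]])                # the first matrix of Exercise 6

print(A.is_diagonalizable())        # True or False, depending on the matrix
# When the result is True, A.diagonalize() returns matrices P and D
# with A = P * D * P**-1; it raises an error otherwise.
# P, D = A.diagonalize()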

