Why is the determinant necessary to find out the eigenvalues of a matrix?


Say I have a $2\times2$ matrix $A$:

$$A = \begin{bmatrix}1&2\\4&3\\ \end{bmatrix}.$$

To find the eigenvalues, I have to solve

$$Au = \lambda u,$$ where $u$ is a non-zero vector. Solving this I get

$$0 = \lambda u -Au \Leftrightarrow \\ 0 = (\lambda I_n -A)u.$$

Since $u$ is non-zero, I concluded that $(\lambda I_n-A) = 0$. Why can't I then find the values of $\lambda$ for which $\lambda I_n - A$ is the null matrix? Why do I have to solve $$\det(\lambda I_n -A)=0$$

instead?

I think that to get a null vector you don't have to multiply it by a null matrix necessarily, so I figure it has something to do with that, but I don't understand why I have to use the determinant.


There are 6 answers below

2
On BEST ANSWER

You partially answered your own question in saying that "I think that to get a null-vector you don't have to multiply it by a null-matrix necessarily".

The other part is that you don't need to use the determinant. If you find an eigenvalue however you do, that's good. You can try to solve $Av = \lambda v$, you can do it by inspection, you can do it by heavenly inspiration (provided you check, ha!).

The fact of the matter is that for an $n\times n$ matrix $A$, $\lambda$ is an eigenvalue of $A$ if and only if $\det(\lambda I - A) = 0$.
This is because if an $n\times n$ matrix $M$ has a nonzero vector $v$ in its kernel, then that $v$ is an eigenvector associated with the eigenvalue $0$. It follows that $\det M$, as the product of $M$'s eigenvalues, is $0$.
The thing is, this also goes in reverse: if $\det M = 0$, then some eigenvalue must be $0$, and so there must be a nonzero vector in $M$'s kernel.
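As a quick numerical illustration of this equivalence (a sketch using NumPy and the matrix from the question; the candidate eigenvalue $5$ is taken as given):

```python
import numpy as np

# The matrix from the question.
A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

lam = 5.0                    # candidate eigenvalue, found however you like
M = lam * np.eye(2) - A      # M = lambda*I - A

# det(M) is the product of M's eigenvalues, so it vanishes
# exactly when 0 is an eigenvalue of M, i.e. when M has a
# nonzero vector in its kernel.
print(np.linalg.det(M))      # ~0, confirming 5 is an eigenvalue of A
print(np.linalg.eigvals(M))  # one of M's eigenvalues is ~0
```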

0
On

You wanted to say $(\lambda I_n - A)u=0$ and $u\ne 0 \implies \lambda I_n-A=0$,

but that's not true, as the example $\lambda=-1$ shows: here $A-\lambda I_n = \pmatrix{2 & 2\\4 & 4}$ is not the null matrix, yet $\pmatrix{2 & 2\\4 & 4}\pmatrix{1\\-1}=\pmatrix{0\\0}$.

What is true is that $\det(\lambda I_n-A)=0$.

2
On

You are not after the null matrix. The eigenvalues of $A$ are $5$ and $-1$, but neither $A-5\operatorname{Id}$ nor $A+\operatorname{Id}$ is the null matrix. The numbers $5$ and $-1$ are the numbers $\lambda$ for which the equation $(A-\lambda\operatorname{Id}).\left[\begin{smallmatrix}x\\y\end{smallmatrix}\right]=\left[\begin{smallmatrix}0\\0\end{smallmatrix}\right]$ has non-null solutions. And those $\lambda$'s are precisely the solutions of the equation $\det(A-\lambda\operatorname{Id})=0$.
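This can be checked numerically (a sketch using NumPy; the eigenvector $(1, 2)$ for $\lambda = 5$ is found by solving the singular system by hand):

```python
import numpy as np

# The matrix from the question; its eigenvalues are 5 and -1.
A = np.array([[1.0, 2.0],
              [4.0, 3.0]])
I = np.eye(2)

for lam in (5.0, -1.0):
    B = A - lam * I
    # B is singular (determinant 0), yet it is not the null matrix.
    print(lam, np.linalg.det(B), np.any(B != 0))

# A non-null solution of (A - 5*Id)x = 0 is an eigenvector for 5:
u = np.array([1.0, 2.0])
print(A @ u)  # equals 5 * u
```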

2
On

Actually, for numerical work with matrices much bigger than $2 \times 2$, you do not want to form the characteristic polynomial and solve for its roots: that approach is numerically unstable. There are much more efficient and stable numerical methods, e.g. the QR algorithm.

0
On

Answering this question is actually quite simple. Putting the determinant aside for a moment, let's focus on solving homogeneous systems. Let $B\in\mathbb{R}^{n\times n}$ and $x\in\mathbb{R}^n$. The homogeneous system $$ Bx=0 $$ has only the trivial solution if and only if the row reduced echelon form (rref) of $B$ is the identity matrix $I$. In other words, $B$ has full rank (i.e. $\text{rank}(B) = n$). For nontrivial solutions, we need $\text{rank}(B) < n$, which occurs exactly when $B$ is singular (i.e. $\text{rref}(B) \nrightarrow I$). Now let's return to the eigenvalue problem. We need to find the nontrivial solutions of the following homogeneous system

$$ \underbrace{(\lambda I - A)}_{B} x = 0. $$
The matrix $(\lambda I - A)$ must be singular. One way to check whether a square matrix $B$ is singular is to test $\det(B) = 0$; therefore, we have $$ \det(\lambda I - A) = 0. $$

0
On

We want $(\lambda I_n - A)u = 0$ with $u \neq 0$. If we assume $(\lambda I_n - A)$ is invertible, then $u = (\lambda I_n - A)^{-1} 0 = 0$, which goes against our requirement that $u \neq 0$. Hence, we want $(\lambda I_n - A)$ to be a non-invertible matrix. A square matrix is non-invertible iff its determinant is $0$. This leads us to the conclusion that $\det(\lambda I_n - A) = 0$ for non-zero eigenvectors.