$\det(A-\lambda I)$ or $\det(\lambda I-A)$. Which one to use?



I am taking a college linear algebra class. We are learning about eigenvalues, and we have the square matrix:

$\begin{bmatrix}4&0&1\\-2&1&0\\-2&0&1\end{bmatrix}$

The textbook provides the equation $\det(\lambda I-A)=0$, whose left-hand side is the characteristic polynomial from which I can retrieve the eigenvalues. I applied it to the matrix above, but I ended up with a polynomial with the opposite signs: $\lambda^3 - 6\lambda^2 + 11\lambda - 6$ instead of $-\lambda^3 + 6\lambda^2 - 11\lambda + 6$ (I have been checking my answers with an online eigenvalue calculator).

I found a variation of this equation on the Internet: $\det(A-\lambda I)=0$. It gives the same values as $\det(\lambda I-A)=0$ except with the opposite signs, which are the correct ones for this matrix.

How can I tell which function to use on which matrix to get the signs correct?

3 Answers

Best Answer

Since $A-\lambda I=-(\lambda I-A)$, we have $$ \det(A-\lambda I)=\det\bigl(-(\lambda I-A)\bigr)= (-1)^n\det(\lambda I-A) $$ where $n$ is the number of rows of the matrix $A$. Thus the roots are the same and it's immaterial which one to use.
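This identity is easy to confirm on the matrix from the question; a quick sketch, assuming SymPy is available:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, 0, 1], [-2, 1, 0], [-2, 0, 1]])
n = A.rows
I = sp.eye(n)

p1 = sp.expand((A - lam * I).det())  # det(A - λI)
p2 = sp.expand((lam * I - A).det())  # det(λI - A)

print(p2)                              # lambda**3 - 6*lambda**2 + 11*lambda - 6
print(sp.simplify(p1 - (-1)**n * p2))  # 0, so the two differ exactly by (-1)^n
print(sp.solve(p2, lam))               # the roots are λ = 1, 2, 3 for both
```

Both polynomials vanish at exactly $\lambda = 1, 2, 3$, as the answer says.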


For an $n \times n$ matrix $A$, $$\det (A - \lambda I) = (-1)^n \det (\lambda I - A).$$ In particular while the two expressions differ by sign (for odd $n$), as polynomials in $\lambda$ they have the same roots: If $p(\lambda) = 0$, then $(-p)(\lambda) = 0$, too.

In practice I prefer to declare the characteristic polynomial to be $$p_A(\lambda) := \det(\lambda I - A)$$ simply because it is monic, i.e., has leading term $\lambda^n$ (rather than $-\lambda^n$ for odd $n$).
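Computer algebra systems tend to follow this monic convention as well; as a small check (assuming SymPy is available), its `charpoly` expands $\det(\lambda I - A)$:

```python
import sympy as sp

A = sp.Matrix([[4, 0, 1], [-2, 1, 0], [-2, 0, 1]])
lam = sp.symbols('lambda')

# SymPy's charpoly uses the monic convention det(λI - A)
p = A.charpoly(lam).as_expr()
print(p)  # lambda**3 - 6*lambda**2 + 11*lambda - 6, leading term +λ³
```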


Yes, it is immaterial which one to use for calculating the eigenvalues, but for checking, $A-\lambda I$ is much easier to work with. For example, with the provided matrix, it's not hard to see that $1$ is an eigenvalue, since $\begin{bmatrix} 4 & 0 & 1\\ -2 & 1 & 0\\ -2 & 0 & 1\end{bmatrix} - \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 3 & 0 & 1\\-2 & 0 & 0\\ -2 & 0 & 0\end{bmatrix}$, which is clearly singular since two rows are the same. When checking that final (singular) matrix against $A$ by eye, it is much easier if the off-diagonal elements are not all negated.

It is also clear here how finding an eigenvector is just getting an element in the null-space; that is, $\begin{pmatrix}0 \\ 1 \\ 0\end{pmatrix} $ is clearly an eigenvector in this example. It is not necessary to solve a set of linear equations to obtain this.
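That null-space observation can be checked directly; a short sketch, assuming SymPy is available:

```python
import sympy as sp

A = sp.Matrix([[4, 0, 1], [-2, 1, 0], [-2, 0, 1]])

# the eigenvectors for λ = 1 span the null space of A - I
ns = (A - sp.eye(3)).nullspace()
print(ns)  # [Matrix([[0], [1], [0]])], the eigenvector read off above
```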

Similarly, $2$ can be seen to be an eigenvalue, via $\begin{bmatrix} 2 & 0 & 1\\ -2 & -1 & 0\\ -2 & 0 & -1\end{bmatrix}$ with eigenvector $\begin{pmatrix}1\\-2\\-2\end{pmatrix}$, and also $3$, via $\begin{bmatrix} 1 & 0 & 1\\ -2 & -2 & 0\\ -2 & 0 & -2\end{bmatrix}$ with eigenvector $\begin{pmatrix}1\\-1\\-1\end{pmatrix}$. No expansion of determinants is required; though with so many $0$s it's not hard to get $\det A = 6$, which is the product of the eigenvalues: $1\times 2\times 3$.
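All three eigenpairs read off above can be verified numerically; a minimal sketch with NumPy (assuming it is installed):

```python
import numpy as np

A = np.array([[4, 0, 1], [-2, 1, 0], [-2, 0, 1]], dtype=float)

# (eigenvalue, eigenvector) pairs read off in the answer above
pairs = [(1, [0, 1, 0]), (2, [1, -2, -2]), (3, [1, -1, -1])]

for lam, v in pairs:
    v = np.array(v, dtype=float)
    assert np.allclose(A @ v, lam * v)                     # A v = λ v
    assert abs(np.linalg.det(A - lam * np.eye(3))) < 1e-9  # A - λI is singular

print(np.linalg.det(A))  # ≈ 6.0, the product of the eigenvalues 1 · 2 · 3
```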