Show $\begin{bmatrix}d-\lambda\cr -c\end{bmatrix}$ is an eigenvector of $\begin{bmatrix}a &b\cr c&d\end{bmatrix}$


Show that $v=\begin{bmatrix}d-\lambda\cr -c\end{bmatrix}$ is an eigenvector of $A=\begin{bmatrix}a &b\cr c&d\end{bmatrix}$.

I first did $\operatorname{det}(A-\lambda I)=0$ and got $\lambda^2+(-a-d)\lambda+(ad-bc)=0$

I then tried the quadratic formula, but that didn't get me anywhere. I need the $\lambda$ values somehow in order to find the eigenvectors, but I can't even get those.


Best answer:

$$\begin{bmatrix}a &b\cr c&d\end{bmatrix}\begin{bmatrix}d-\lambda\cr -c\end{bmatrix}=\begin{bmatrix}ad-\lambda a-bc\\cd-\lambda c-cd\end{bmatrix}=\begin{bmatrix}\lambda d-\lambda^2\\-\lambda c\end{bmatrix}=\lambda\begin{bmatrix}d-\lambda\\-c\end{bmatrix},$$ where the second equality follows using $\lambda^2+(-a-d)\lambda+(ad-bc)=0$.
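The computation above can be sanity-checked numerically. A minimal sketch using numpy, with an arbitrarily chosen sample matrix (the particular entries are not from the question):

```python
import numpy as np

# Sample matrix [[a, b], [c, d]]; entries chosen arbitrarily for illustration.
A = np.array([[2.0, 1.0],
              [3.0, 4.0]])
c, d = A[1]

for lam in np.linalg.eigvals(A):
    # Candidate eigenvector [d - lambda, -c] from the claim.
    v = np.array([d - lam, -c])
    # A v should equal lambda * v whenever lambda satisfies the
    # characteristic equation lambda^2 - (a+d) lambda + (ad - bc) = 0.
    assert np.allclose(A @ v, lam * v)
```

For this matrix the eigenvalues are $1$ and $5$, and both candidate vectors are nonzero, so the assertion passes for each.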

Edit: Of course, since an eigenvector must be nonzero, this is only true when $d \neq \lambda$ or $c\neq 0$.

Another answer:

I guess you're missing a very important assumption, namely that $\lambda$ is supposed to be an eigenvalue of the matrix. Then the rank of $$ \begin{bmatrix} a-\lambda & b \\ c & d-\lambda \end{bmatrix} $$ is at most one. If the second row of this matrix is not zero, then you get an eigenvector by solving the equation $cx+(d-\lambda)y=0$, because the vector $\left[\begin{smallmatrix} x\\y\end{smallmatrix}\right]$ then automatically satisfies $(a-\lambda)x+by=0$ as well. A nonzero solution is $$ \begin{bmatrix} d-\lambda \\ -c \end{bmatrix}. $$ However, the statement is false in general, and the assumption that $d-\lambda\ne0$ or $c\ne0$ is necessary. Consider the matrix $$ \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}. $$ Its only eigenvalue is $1$, and the assigned vector is $$ \begin{bmatrix} d-\lambda \\ -c \end{bmatrix} = \begin{bmatrix} 1 - 1 \\ -0 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \end{bmatrix}, $$ which is not an eigenvector.
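The degenerate case can also be checked directly. A short sketch (using numpy) showing that the recipe $[d-\lambda,\,-c]$ collapses to the zero vector for the shear matrix above:

```python
import numpy as np

# The counterexample: an upper-triangular shear matrix.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
c, d = A[1]

lam = 1.0  # the only eigenvalue of A
v = np.array([d - lam, -c])

# The candidate vector is [0, 0], so it cannot be an eigenvector,
# even though A v = lam * v holds trivially.
assert np.allclose(v, 0)
```

Here $d-\lambda = 0$ and $c = 0$ simultaneously, which is exactly the situation excluded by the edit to the accepted answer.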

Another instance where this fails is when $a=b=c=d=0$.