How to prove that $(b,\lambda-a)^T$ is the general formula for the eigenvector of a $2\times2$ matrix?


Let $$A = \begin{bmatrix}a&b\\c&d\end{bmatrix}$$ with eigenvalue $k$. Show that unless it is $0$, the vector $(b, k - a)^T$ is an eigenvector.

If $k$ is the only eigenvalue, then $b$ and $c$ should both be $0$ and $a = d = k$, so that $(b, k - a)^T = 0$; so that case shouldn't occur, and there should be $2$ distinct eigenvalues. However, I can't find a way to prove that the vector is indeed an eigenvector. Since the first component is $b$, the first component of $A(b, k-a)^T$ certainly comes out as $kb$, but I have trouble with the $k - a$ part.
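As a quick numerical sanity check of the claim (not a proof), one can verify $A(b, k-a)^T = k\,(b, k-a)^T$ for a sample matrix. The matrix entries and the `eigenvalues` helper below are arbitrary illustrations, not part of the question:

```python
# Sanity check: for a sample 2x2 matrix A = [[a, b], [c, d]],
# the vector v = (b, k - a) satisfies A v = k v for each eigenvalue k.

def eigenvalues(a, b, c, d):
    """Roots of the characteristic polynomial k^2 - (a+d)k + (ad - bc)."""
    tr, det = a + d, a * d - b * c
    disc = (tr * tr - 4 * det) ** 0.5   # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

a, b, c, d = 2.0, 1.0, 1.0, 3.0        # arbitrary example with b != 0
for k in eigenvalues(a, b, c, d):
    v = (b, k - a)
    Av = (a * v[0] + b * v[1], c * v[0] + d * v[1])
    assert abs(Av[0] - k * v[0]) < 1e-9
    assert abs(Av[1] - k * v[1]) < 1e-9
```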


There are 5 solutions below.

---

We want to show that $\mathbf{v}=\begin{bmatrix}b\\k-a\end{bmatrix}$ is an eigenvector of $A$ corresponding to the eigenvalue $k$, i.e. that

\begin{align*} A\mathbf{v}&=k\mathbf{v}\\ \begin{bmatrix} a & b\\ c & d \end{bmatrix}\begin{bmatrix}b\\k-a\end{bmatrix}&=k\begin{bmatrix}b\\k-a\end{bmatrix}\\ \begin{bmatrix}bk\\cb+dk-ad\end{bmatrix}&=k\begin{bmatrix}b\\k-a\end{bmatrix}. \end{align*} The first components agree automatically. Equating the second components, the claim reduces to $$cb+dk-ad=k^2-ak \iff \color{red}{k^2-(a+d)k+(ad-bc)=0}.$$ This holds because the equation in red is the characteristic equation of the given matrix, and eigenvalues are roots of this equation.

If $ad-bc=0$ (i.e. the determinant is $0$), then $k=0$ is one of the eigenvalues, and the formula applies to it as well.
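The reduction above (the second component matches $k^2-ak$ exactly when the characteristic equation holds) can be spot-checked over random integer matrices; the sampling range is an arbitrary choice:

```python
import random

random.seed(0)
for _ in range(100):
    a, b, c, d = (random.randint(-5, 5) for _ in range(4))
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        continue                        # skip complex eigenvalues here
    k = (tr + disc ** 0.5) / 2          # a root of k^2 - tr*k + det = 0
    # the second-component identity from the derivation above:
    assert abs((c * b + d * k - a * d) - (k * k - a * k)) < 1e-6
```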

---

Asserting that $k$ is an eigenvalue of $A$ is equivalent to $\det(A-k\operatorname{Id})=0$, which means that$$(a-k)(d-k)-bc=0(\iff ad-ak-dk+k^2=bc).\tag1$$On the other hand$$\begin{bmatrix}a&b\\c&d\end{bmatrix}.\begin{bmatrix}b\\k-a\end{bmatrix}=k\begin{bmatrix}b\\k-a\end{bmatrix}\iff\left\{\begin{array}{l}ab+bk-ba=bk\\cb+dk-da=k^2-ak.\end{array}\right.$$The first equality is trivial, and the second one holds by $(1)$.

---

Since $k$ is an eigenvalue, we know that $$\det (A-kI)=0$$

That gives you $$k^2-k(a+d)+ad-bc=0$$

This is exactly what you need for the second component of the eigenvector equation to work out. It may look messy, but it works fine when you multiply the matrix by the proposed eigenvector.

---

Since $k$ is an eigenvalue, $\dim\ker\left(\left[ \begin{array}{cc} a-k & b\\ c & d-k \end{array} \right]\right)>0$, thus $\det\left[ \begin{array}{cc} a-k & b\\ c & d-k \end{array} \right]=0$, then $$(k-a)(k-d)-bc=0\iff \color{blue}{bc}=(k-a)(k-d),$$ and $$\left[ \begin{array}{cc} a & b\\ c & d \end{array} \right]\left[ \begin{array}{c} b\\ k-a \end{array} \right]=\left[ \begin{array}{c} bk\\ \color{blue}{cb}+d(k-a) \end{array} \right]=\left[ \begin{array}{c} bk\\ (k-d+d)(k-a) \end{array} \right]=k\left[ \begin{array}{c} b\\ k-a \end{array} \right].$$

---

The key observation is that the vector $(v,-u)^\top$ is orthogonal to the vector $(u,v)^\top$. If $k$ is an eigenvalue of the matrix $A$, then the matrix $A-kI$ is singular, so its rows are linearly dependent. The first row of $A-kI$ is $(a-k,b)$. Any nonzero vector $\mathbf x$ satisfying $(A-kI)\mathbf x=0$ must be orthogonal to the rows of $A-kI$, and hence must be a scalar multiple of $(b,k-a)^\top$, as desired. There is one exception: what if $(a-k,b) = (0,0)$? Then we need to go to the second row and use the vector $(k-d,c)^\top$ instead.
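This orthogonality can be checked numerically for a sample matrix (entries chosen arbitrarily): the proposed eigenvector is orthogonal to the first row of $A-kI$ identically, and to the second row because $\det(A-kI)=0$.

```python
a, b, c, d = 1.0, 2.0, 3.0, 4.0             # arbitrary sample matrix
tr, det = a + d, a * d - b * c
k = (tr + (tr * tr - 4 * det) ** 0.5) / 2   # one (real) eigenvalue
v = (b, k - a)
# orthogonal to row 1 of A - kI, i.e. (a - k, b), identically:
assert abs((a - k) * v[0] + b * v[1]) < 1e-9
# orthogonal to row 2, i.e. (c, d - k), because det(A - kI) = 0:
assert abs(c * v[0] + (d - k) * v[1]) < 1e-9
```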

Note that a repeated eigenvalue alone does not force $(b,k-a)^\top$ to vanish: for the Jordan block $A=\begin{bmatrix}k&1\\0&k\end{bmatrix}$ the formula still gives the eigenvector $(1,0)^\top$. The formula degenerates only when $b=0$ and $a=k$, in which case the first row of $A-kI$ is zero and one falls back on the second-row vector $(k-d,c)^\top$; if that vanishes as well, then $A=kI$ and every nonzero vector is an eigenvector.
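A minimal sketch of the fallback case (matrix chosen arbitrarily, with $b=0$ and $a=k$ so that $(b,k-a)^\top$ vanishes): the second-row vector $(k-d,c)^\top$ still works.

```python
a, b, c, d = 2.0, 0.0, 3.0, 2.0        # A - 2I has a zero first row
k = 2.0                                # repeated eigenvalue, yet A != kI
assert (b, k - a) == (0.0, 0.0)        # the usual formula degenerates
u = (k - d, c)                         # second-row fallback vector
Au = (a * u[0] + b * u[1], c * u[0] + d * u[1])
assert Au == (k * u[0], k * u[1])      # u is an eigenvector for k
```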