What's the fastest way to determine Eigenvalues & Eigenvectors of any 2 by 2 Matrix?


My instructor claims that it's inefficient and superfluous to compute eigenvectors de novo for each $2$ by $2$ matrix. He suggested a trick instead which resembles the eigenvectors and cases here. He avers that once you find $\lambda$, then immediately conclude from $A - \lambda I = \begin{bmatrix} a-\lambda & b \\ c & d - \lambda \\ \end{bmatrix}$,
$1.$ that an eigenvector is always $\begin{bmatrix} \color{#FF4F00}{\LARGE{-}} b \\ a - \lambda \end{bmatrix}$ or its negative: $\begin{bmatrix} b \\ \color{#FF4F00}{\LARGE{-}}(a - \lambda) \end{bmatrix}$

$2.$ If the first row of $A - \lambda I$ is $\mathbf{0}$, then an eigenvector is always $\begin{bmatrix} \color{#FF4F00}{\LARGE{-}}(d - \lambda) \\ c \end{bmatrix}$ or the negative of this.

$3.$ If $A - \lambda I$ is the zero matrix, then any nonzero vector is an eigenvector. I think this is the reason.
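The three cases above can be sketched in a few lines of code. Here is a minimal Python illustration (the function name and the example matrix are my own choices, not from the instructor):

```python
def eigvec_2x2(a, b, c, d, lam):
    """Row trick: an eigenvector of [[a, b], [c, d]] for eigenvalue lam."""
    if (a - lam, b) != (0.0, 0.0):   # case 1: first row of A - lam*I is nonzero
        return (-b, a - lam)
    if (c, d - lam) != (0.0, 0.0):   # case 2: first row zero, use the second row
        return (-(d - lam), c)
    return (1.0, 0.0)                # case 3: A - lam*I = 0, any nonzero vector works

# A = [[4, 1], [2, 3]] has eigenvalues 5 and 2 (trace 7, determinant 10).
for lam in (5.0, 2.0):
    x, y = eigvec_2x2(4.0, 1.0, 2.0, 3.0, lam)
    assert (4.0 * x + 1.0 * y, 2.0 * x + 3.0 * y) == (lam * x, lam * y)  # A v = lam v
```

The assertions confirm that for this matrix the trick really does return a vector $v$ with $Av = \lambda v$.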

Informally and intuitively, would someone please explain/expound on his assertions? What about for eigenvalues or higher dimensions? No formal proofs or arguments please.

If so, why don't textbooks explain this? Strang alludes to it on p. 288, but only in two sentences.


There are 2 best solutions below


M LePressentiment: It's not so difficult! In order for $Bx=0$ to occur, $x$ must be orthogonal to the rows of $B$. And in $\Bbb R^2$, it's easy to check that vectors orthogonal to $(a,b)$ are scalar multiples of $(-b,a)$. (Geometrically, we're rotating $90^\circ$.)
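The rotation fact is easy to check numerically. A quick sketch (the numbers are illustrative):

```python
def rot90(v):
    """Rotate a 2-vector 90 degrees counterclockwise: (x, y) -> (-y, x)."""
    x, y = v
    return (-y, x)

row = (3.0, 4.0)
perp = rot90(row)                               # (-4.0, 3.0)
dot = row[0] * perp[0] + row[1] * perp[1]       # zero dot product: orthogonal
```

Since the dot product is zero, $(-b, a)$ is orthogonal to $(a, b)$, and in $\Bbb R^2$ every vector orthogonal to $(a, b)$ is a scalar multiple of it.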


The explanation of the trick your teacher used: For a matrix $$ \begin{pmatrix} a&b\\c&d \end{pmatrix} $$ we have $\lambda_1 + \lambda_2 = a+d$ and $\lambda_1\cdot\lambda_2 = ad-bc$. From these equations you can solve for the eigenvalues. This works because the trace and determinant are invariant under change of basis.
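These two equations amount to solving the quadratic $\lambda^2 - (a+d)\lambda + (ad-bc) = 0$. A minimal sketch in Python, assuming real eigenvalues (function name is my own):

```python
import math

def eig_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]]: roots of
    lam**2 - (a + d)*lam + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4.0 * det)  # assumes a real (nonnegative) discriminant
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# trace 7, determinant 10 -> eigenvalues 5 and 2
print(eig_2x2(4.0, 1.0, 2.0, 3.0))  # (5.0, 2.0)
```

If the discriminant is negative the eigenvalues are complex, and `math.sqrt` would need to be replaced by `cmath.sqrt`.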

A caveat: the two equations always determine the eigenvalues (they are the roots of the characteristic polynomial), but if the matrix is not diagonalisable the two eigenvalues coincide and there is only one independent eigenvector direction.

Idea for 1: Compute $Av$ for these vectors. If the result is $\lambda v$ for the eigenvalue $\lambda$ you started with, then his trick is correct.

Idea for 2: What he uses here is $$ Av=\lambda v\iff Av-\lambda v=0\iff v\in\ker(A-\lambda I) $$

Idea for 3: According to the equation above, every nonzero vector $v$ in the kernel of $A-\lambda I$ is an eigenvector for $\lambda$.
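To make the kernel statement concrete, here is a quick numeric check that the row-trick vector lies in $\ker(A - \lambda I)$ (the matrix and eigenvalue are chosen for illustration):

```python
# For A = [[4, 1], [2, 3]] and lam = 5, the row-trick vector v = (-b, a - lam)
# satisfies (A - lam*I) v = 0, i.e. v is in the kernel of A - lam*I.
a, b, c, d, lam = 4.0, 1.0, 2.0, 3.0, 5.0
v = (-b, a - lam)                        # = (-1.0, -1.0)
residual = ((a - lam) * v[0] + b * v[1],
            c * v[0] + (d - lam) * v[1])
print(residual)  # (0.0, 0.0): v is killed by A - lam*I
```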