$2\times 2$ real matrix with exactly one eigenvalue


Problem: Let $A$ be a $2\times 2$ real matrix with exactly one eigenvalue $\lambda \in \mathbb{R}$, and suppose $A \not= \lambda I$. Show that there exists an invertible matrix $P$ such that $$ P^{-1}AP = \pmatrix{\lambda&1\\0&\lambda}$$

What I have so far:

Firstly, if $\lambda \in \mathbb{R}$ is an eigenvalue, then $(A - \lambda I )\mathbf{v} = 0$ for some nonzero vector $\mathbf{v} \in \mathbb{R}^2$. And since $A - \lambda I \not= 0$, the eigenspace of $\lambda$ has dimension exactly $1$.

Therefore, there exists $\mathbf{t}$ such that $\mathbf{t}$ and $\mathbf{v}$ are linearly independent, and $A \mathbf{t} \not= \lambda \mathbf{t}$. So $A \mathbf{t} = \alpha \mathbf{v} + \beta \mathbf{t}$, where $\alpha \not= 0$ (otherwise $\mathbf{t}$ would be an eigenvector with eigenvalue $\beta$, forcing $\beta = \lambda$ and hence $A\mathbf{t} = \lambda\mathbf{t}$). This gives $$ A \frac{\mathbf{t}}{\alpha} = \mathbf{v} + \beta \frac{\mathbf{t}}{\alpha}$$

Thus, letting $$P = \big[ \mathbf {v} \quad \frac{1}{\alpha}\mathbf{t} \big],$$ with $\mathbf{v}$ and $\frac{1}{\alpha}\mathbf{t}$ as its columns, yields

$$ P^{-1}AP = \pmatrix{\lambda&1\\0& \beta}$$

What am I missing? Could someone give a hint? Thank you!
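As a numerical sanity check of the construction above (an illustration with one concrete matrix, not part of a proof; the specific numbers are my own choices), one can run it in numpy and look at what the bottom-right entry turns out to be:

```python
import numpy as np

lam = 3.0
A = np.array([[3.0, 5.0],
              [0.0, 3.0]])            # only eigenvalue is 3, and A != 3*I

v = np.array([1.0, 0.0])              # eigenvector: (A - lam*I) v = 0
t = np.array([0.0, 1.0])              # independent of v, with A t != lam*t

At = A @ t                            # = (5, 3) = alpha*v + beta*t
alpha, beta = At[0], At[1]            # coordinates in basis (v, t); alpha = 5 != 0

P = np.column_stack([v, t / alpha])   # columns v and t/alpha
print(np.linalg.inv(P) @ A @ P)       # [[3. 1.]
                                      #  [0. 3.]] -- note the bottom-right entry
```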

edits: Thank you for the solutions. (Also made minor edits as in the comments.)


BEST ANSWER

Classifying all real $ 2 \times 2 $ matrices by their minimal polynomials (excluding complex eigenvalues), we have the following three cases:

  • Minimal polynomial of degree one: scalar matrices $ A = \lambda I $.
  • Minimal polynomial of degree two with distinct roots: diagonalizable matrices with two distinct eigenvalues.
  • Minimal polynomial of degree two with a repeated root: non-diagonalizable matrices.

Since our matrix has only one eigenvalue but is not $ \lambda I $, it falls into the third category; i.e., its minimal polynomial is $ (x - \lambda)^2 $, where $ \lambda $ is its eigenvalue. Therefore $ (A - \lambda I)^2 = 0 $; i.e., the map $ A - \lambda I $ sends $ \mathbb{R}^2 $ into its own kernel.

Now, note that the system given by $ A v_1 = \lambda v_1 $ and $ A v_2 = v_1 + \lambda v_2 $ has a solution pair $ v_1, v_2 $. Indeed, pick $ v_1 $ to be an eigenvector corresponding to the eigenvalue $ \lambda $, and note that the second equation is equivalent to $ (A - \lambda I)v_2 = v_1 $. Since the kernel of $ A - \lambda I $ is spanned by $ v_1 $ and the map sends everything into this kernel, any $ v \notin \textrm{span} \{ v_1 \} $ satisfies $ (A - \lambda I)v = c\, v_1 $ for some $ c \neq 0 $, so $ v_2 = v/c $ works. Now, choose $ P $ to be the matrix $ (v_1, v_2) $ in column form.
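This construction can be sketched numerically as well (a hedged illustration with a concrete matrix of my choosing, not a proof):

```python
import numpy as np

lam = 2.0
A = np.array([[2.0, 0.0],
              [1.0, 2.0]])       # single eigenvalue 2, and A != 2*I

N = A - lam * np.eye(2)          # nilpotent: N @ N == 0 (minimal poly (x-2)^2)
assert np.allclose(N @ N, 0)

v1 = np.array([0.0, 1.0])        # eigenvector: N @ v1 == 0
v = np.array([1.0, 0.0])         # any vector outside span{v1}

# N maps into its kernel span{v1}, so N v = c * v1 with c != 0.
c = v1 @ (N @ v)                 # coefficient of v1 (v1 is a unit vector here)
v2 = v / c                       # then (A - lam*I) v2 = v1

P = np.column_stack([v1, v2])
print(np.linalg.inv(P) @ A @ P)  # [[2. 1.]
                                 #  [0. 2.]]
```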

Another answer:

If $P^{-1} A P = \begin{bmatrix} \lambda & 1\\ 0 & \lambda\end{bmatrix}$, then $A P = P \begin{bmatrix} \lambda & 1\\ 0 & \lambda\end{bmatrix}$. If $P = \begin{bmatrix} | & |\\ p_1 & p_2\\ | & |\end{bmatrix}$, then

$$\begin{bmatrix} | & |\\ A p_1 & A p_2\\ | & |\end{bmatrix} = \begin{bmatrix} | & |\\ \lambda p_1 & p_1 + \lambda p_2\\ | & |\end{bmatrix}$$

Hence,

$$(A - \lambda I_2) \, p_1 = 0_2 \qquad \qquad \qquad (A - \lambda I_2) \, p_2 = p_1$$

Left-multiplying both sides of the latter linear system by $A - \lambda I_2$,

$$(A - \lambda I_2) \, p_1 = 0_2 \qquad \qquad \qquad (A - \lambda I_2)^2 \, p_2 = 0_2$$

where $p_1$ and $p_2$ must be linearly independent; otherwise $P$ would not be invertible.
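The two systems above can be checked numerically for one concrete matrix (an illustration only; the matrix and the vectors $p_1, p_2$ below are my own choices):

```python
import numpy as np

lam = -1.0
A = np.array([[-1.0, 4.0],
              [ 0.0, -1.0]])          # single eigenvalue -1, and A != -I
J = np.array([[lam, 1.0],
              [0.0, lam]])            # the target Jordan block

p1 = np.array([1.0, 0.0])             # solves (A - lam*I) p1 = 0
p2 = np.array([0.0, 0.25])            # solves (A - lam*I) p2 = p1,
                                      # since A - lam*I = [[0,4],[0,0]]

M = A - lam * np.eye(2)
assert np.allclose(M @ p1, 0)         # first system
assert np.allclose(M @ p2, p1)        # second system
assert np.allclose(M @ M @ p2, 0)     # after left-multiplying by A - lam*I

P = np.column_stack([p1, p2])
assert np.allclose(A @ P, P @ J)      # hence P^{-1} A P = J
print("all checks pass")
```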