Find all matrices $A\in \mathbb{R}^{2\times2}$ such that $A^2=\bf{0}$


Attempt:

Let's consider $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$.

$$\begin{align} A^2 &= \begin{bmatrix} a & b \\ c & d \end{bmatrix} \cdot \begin{bmatrix} a & b \\ c & d \end{bmatrix} \\&= \begin{bmatrix} a\cdot a+b\cdot c & a\cdot b + b\cdot d \\ c\cdot a + d\cdot c & c\cdot b + d\cdot d \end{bmatrix}\\ &= \bf{0} \end{align}$$

This gives us the system of equations: $$\begin{align} a\cdot a+b\cdot c &= 0 \tag{1}\\ a\cdot b + b\cdot d = b\cdot(a+d)&= 0 \tag{2}\\ c\cdot a + d\cdot c = c\cdot(a+d)&= 0 \tag{3}\\ c\cdot b + d\cdot d &= 0 \tag{4} \end{align}$$
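The system above can be spot-checked numerically; a minimal sketch (the helper `matmul2` and the sample entries are my own, not from the question):

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0],
             X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0],
             X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

# Sample entries satisfying (1)-(4): a=2, b=4, c=-1, d=-2,
# so a*a + b*c = 4 - 4 = 0 and a+d = 0, making (2) and (3) vanish too.
A = [[2, 4], [-1, -2]]
print(matmul2(A, A))  # [[0, 0], [0, 0]]
```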

Now equations $(2)$ and $(3)$ force at least one of the following:

  1. $b = 0$
  2. $c = 0$
  3. $a+d = 0$

(Counting the combinations of these conditions gives eight cases in total, but the five combined cases are covered by treating the three basic ones, so I won't list them.)

Case 1 ($b=0$):

$b=0$ implies $a = 0$ in equation $(1)$ and $d = 0$ in equation $(4)$. This means that if $A = \begin{bmatrix}0&0\\c&0\end{bmatrix}$ then $A^2=\bf{0}$.

Case 2 ($c=0$):

From symmetry with $b$, $c=0 \implies A=\begin{bmatrix}0&b\\0&0\end{bmatrix}$.

So, we only consider cases where $b\neq0$ and $c\neq 0 $ which leaves us only with case 3 ($a+d=0$).

Case 3 ($a+d=0$):

Substituting $a+d=0$ (i.e. $d=-a$) into equation $(1)$ gives $a^2+b\cdot c=0$, which is equivalent to $a\cdot d - b\cdot c = 0$; equation $(4)$ reduces to the same condition. So $A=\begin{bmatrix}a&b\\c&-a\end{bmatrix}$ must in addition satisfy $\det A = 0$, i.e. be non-invertible, and then $A^2 = \bf{0}$.

In summary, if $A$ has one of the following forms:

$$\begin{bmatrix}0&b\\0&0\end{bmatrix}, \begin{bmatrix}0&0\\c&0\end{bmatrix}, \begin{bmatrix}a&b\\c&-a\end{bmatrix} \text{ (and not invertible) }$$

then $A^2=\bf{0}$.
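A quick numeric spot-check of the three summarized forms (the sample values and helper name are my own choices):

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# One sample of each form; the third has a=3, b=9, c=-1, d=-3,
# so a^2 + b*c = 9 - 9 = 0, i.e. det A = 0 as required.
samples = [
    [[0, 5], [0, 0]],    # strictly upper-triangular form
    [[0, 0], [7, 0]],    # strictly lower-triangular form
    [[3, 9], [-1, -3]],  # trace-zero, determinant-zero form
]
for A in samples:
    print(matmul2(A, A))  # each prints [[0, 0], [0, 0]]
```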


Questions:

  1. Is this a correct proof?
  2. What is the standard proof?

"Strange question":

  1. How can I know that there were only the 8 cases? That is, how do I know that only these 8 cases are relevant to $A^2 = \bf{0}$?


Best answer:

Is this a correct proof?

It's good, but some might quibble with the third case. You require "and $A$ is not invertible" in your summary, which doesn't explicitly put conditions on $a,b,c$.

What is the standard proof?

I'm not sure I can say "standard", but what first comes to mind for me uses eigenvalues and the Jordan canonical form. Since $A^2=0$, the only eigenvalue of $A$ is $0$. Then either:

  1. $A\sim\begin{bmatrix}0&0\\0&0\end{bmatrix}$. But that more or less immediately means $A=\begin{bmatrix}0&0\\0&0\end{bmatrix}$.

  2. $A\sim\begin{bmatrix}0&1\\0&0\end{bmatrix}$. This means for some invertible $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ you have: $$A=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}0&1\\0&0\end{bmatrix}\begin{bmatrix}a&b\\c&d\end{bmatrix}^{-1}=\frac{1}{ad-bc}\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$$ Simplifying the presentation, there is some nonzero $k$ such that $$A=k\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$$ (And allowing $k=0$ actually covers the first case.)

So now we know $A=k\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$ for some real numbers $a,c,k$. Let's view the Cartesian $(a,c)$ in polar coordinates $(r;t)$. Then we know that $A=k\begin{bmatrix}-r^2\cos(t)\sin(t)&r^2\cos^2(t)\\-r^2\sin^2(t)&r^2\cos(t)\sin(t)\end{bmatrix}$. And we can absorb the $r^2$ into $k$, and write $$ \begin{align} A&=k\begin{bmatrix}-\cos(t)\sin(t)&\cos^2(t)\\-\sin^2(t)&\cos(t)\sin(t)\end{bmatrix}\\ &=\frac{k}2\begin{bmatrix}-\sin(2t)&1+\cos(2t)\\\cos(2t)-1&\sin(2t)\end{bmatrix}\\ &=q\begin{bmatrix}-\sin(s)&1+\cos(s)\\\cos(s)-1&\sin(s)\end{bmatrix} \end{align}$$

Conversely, if $A$ is of this form then it is easy to directly show that $A^2=0$.

This form shows that the "shape" of the collection of matrices with $A^2=0$ is a cone: choose some $q$ in $\mathbb{R}$ and some $s$ in $S^1$, and you have your $A$. The correspondence is one-to-one, except that when $q=0$ the value of $s$ is irrelevant.
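The parametrized form can be confirmed numerically; a sketch in floating-point arithmetic (the function name `nilpotent_qs` is my own):

```python
import math

def nilpotent_qs(q, s):
    """The matrix q * [[-sin s, 1+cos s], [cos s - 1, sin s]] from above."""
    return [[-q*math.sin(s), q*(1 + math.cos(s))],
            [q*(math.cos(s) - 1), q*math.sin(s)]]

def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = nilpotent_qs(2.5, 0.7)
A2 = matmul2(A, A)
# Every entry of A^2 is zero up to floating-point rounding.
print(max(abs(x) for row in A2 for x in row) < 1e-12)  # True
```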

How can I know if there was only the 8 cases?

Having 8 cases was particular to your approach. So I'm not sure how to answer the question. My approach has two cases. Someone else's might have 16 cases. But your approach had 8 cases, and under that approach there aren't any more because you exhausted the logical options you focused on.

Answer:

Your proof is correct, although rather naive, in the sense that you've converted the matrix equation into a system of simultaneous equations and solved it directly. There's no "standard" proof per se, but there are many more conceptual ways to prove the result.

One example is the eigenvalue argument provided by Bungo in the comments. Another way is to note that if $A^2=0$, then the minimal polynomial of $A$ must be $x^2$ (except in the trivial case $A=0$, where it is $x$). Since the minimal polynomial of a matrix divides its characteristic polynomial, and both are monic of degree $2$ here, the characteristic polynomial of $A$ must be $x^2$ itself; that is, $\operatorname{tr} A = 0$ and $\det A = 0$. It is easy to deduce all the possible forms of $A$ from here.
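For a $2\times2$ matrix the characteristic polynomial is $x^2 - (\operatorname{tr} A)x + \det A$, so the divisibility argument amounts to checking that the trace and determinant both vanish; a small sketch (the helper name and sample matrix are mine):

```python
def char_poly_coeffs(A):
    """Coefficients (1, -trace, det) of x^2 - (tr A)x + det A for a 2x2 matrix."""
    tr = A[0][0] + A[1][1]
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return (1, -tr, det)

# A nonzero matrix with A^2 = 0 (a=3, b=9, c=-1, d=-3):
A = [[3, 9], [-1, -3]]
print(char_poly_coeffs(A))  # (1, 0, 0), i.e. the polynomial x^2
```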

The advantage of these more "high-level" arguments is that they generalise easily, to e.g. higher dimensional vector spaces.

Answer:

If $A^2=0$, then $A$ is singular. Hence $Au=0$ for some nonzero vector $u$. Pick any vector $v$ that is linearly independent of $u$. Then $Av=au+bv$ for some scalars $a$ and $b$. It follows from $A^2=0$ that $$ 0=A^2v=A(Av)=A(au+bv)=aAu+bAv=bAv=b(au+bv)=abu+b^2v. $$ Thus $b$ must be zero. Hence $Av=au$ and $Au=0$. Conversely, if $Av=au$ and $Au=0$, it is straightforward to verify that $A^2$ maps both $u$ and $v$ to zero. Thus $A^2=0$.

In short, when $A$ is $2\times2$, $A^2=0$ iff there is some basis $\{u,v\}$ such that $Au=0$ and $Av=au$ for some scalar $a$. In terms of matrices, if $P$ denotes the augmented matrix $[u,v]$, we see that $A^2=0$ iff $$ A=P\pmatrix{0&a\\ 0&0}P^{-1} $$ for some nonsingular matrix $P$.
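The conjugation form can be verified with exact rational arithmetic; a sketch (the sample $P$, the choice $a=5$, and the helper names are my own):

```python
from fractions import Fraction as F

def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    """Exact inverse of a nonsingular 2x2 matrix with Fraction entries."""
    det = P[0][0]*P[1][1] - P[0][1]*P[1][0]
    return [[ P[1][1]/det, -P[0][1]/det],
            [-P[1][0]/det,  P[0][0]/det]]

P = [[F(1), F(2)], [F(3), F(4)]]  # any nonsingular matrix
N = [[F(0), F(5)], [F(0), F(0)]]  # the middle factor, with a = 5
A = matmul2(matmul2(P, N), inv2(P))
print([[int(x) for x in row] for row in matmul2(A, A)])  # [[0, 0], [0, 0]]
```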