Attempt:
Let's consider $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$.
$$\begin{align} A^2 &= \begin{bmatrix} a & b \\ c & d \end{bmatrix} \cdot \begin{bmatrix} a & b \\ c & d \end{bmatrix} \\&= \begin{bmatrix} a\cdot a+b\cdot c & a\cdot b + b\cdot d \\ c\cdot a + d\cdot c & c\cdot b + d\cdot d \end{bmatrix}\\ &= \bf{0} \end{align}$$
This gives us the system of equations: $$\begin{align} a\cdot a+b\cdot c &= 0 \tag{1}\\ a\cdot b + b\cdot d = b\cdot(a+d)&= 0 \tag{2}\\ c\cdot a + d\cdot c = c\cdot(a+d)&= 0 \tag{3}\\ c\cdot b + d\cdot d &= 0 \tag{4} \end{align}$$
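As a sanity check on the four entry formulas (not part of the proof), here is a minimal Python sketch with a hand-rolled 2×2 multiply; `matmul2` is my own helper, not a library function, and the sample entries are arbitrary:

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

a, b, c, d = 2.0, 4.0, -1.0, -3.0   # arbitrary sample entries
A = ((a, b), (c, d))
(p, q), (r, s) = matmul2(A, A)

assert p == a*a + b*c       # entry (1,1), as in (1)
assert q == b*(a + d)       # entry (1,2), factored as in (2)
assert r == c*(a + d)       # entry (2,1), factored as in (3)
assert s == c*b + d*d       # entry (2,2), as in (4)
```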
Now from equations $(2)$ and $(3)$, we have eight cases, built from three basic conditions:
- $b = 0$
- $c = 0$
- $a+d = 0$

plus the 5 combinations of these, which I won't bother listing.
Case 1 ($b=0$):
$b=0$ implies $a = 0$ in equation $(1)$ and $d = 0$ in equation $(4)$. This means that if $A = \begin{bmatrix}0&0\\c&0\end{bmatrix}$ then $A^2=\bf{0}$.
Case 2 ($c=0$):
From symmetry with $b$, $c=0 \implies A=\begin{bmatrix}0&b\\0&0\end{bmatrix}$.
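A quick numerical spot-check of Cases 1 and 2 (again with a hand-rolled 2×2 multiply; `matmul2` is my own helper, not a library call):

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

ZERO = ((0, 0), (0, 0))
for t in (-3, 0.5, 7):
    assert matmul2(((0, 0), (t, 0)), ((0, 0), (t, 0))) == ZERO  # Case 1: a=b=d=0
    assert matmul2(((0, t), (0, 0)), ((0, t), (0, 0))) == ZERO  # Case 2: a=c=d=0
```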
So we only need to consider the case where $b\neq0$ and $c\neq0$, which leaves us with Case 3 ($a+d=0$).
Case 3 ($a+d=0$):
With $d=-a$, equation $(1)$ becomes $a^2 + b\cdot c = -a\cdot d + b\cdot c = 0$, i.e. $a\cdot d - b\cdot c = \det A = 0$. Equation $(4)$ then holds automatically, since $c\cdot b + d^2 = c\cdot b + a^2 = 0$ is again equation $(1)$. So if $A=\begin{bmatrix}a&b\\c&-a\end{bmatrix}$ is not invertible, then $A^2 = \bf{0}$.
In summary, if $A$ has one of the following forms:
$$\begin{bmatrix}0&b\\0&0\end{bmatrix}, \begin{bmatrix}0&0\\c&0\end{bmatrix}, \begin{bmatrix}a&b\\c&-a\end{bmatrix} \text{ (and not invertible) }$$
then $A^2=\bf{0}$.
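For the third form, one can check numerically that trace zero plus $\det A = 0$ (i.e. $b\cdot c=-a^2$) does give $A^2=\bf{0}$; a minimal Python sketch, with `matmul2` a hand-rolled helper and the $(a,b)$ pairs arbitrary:

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

for a, b in ((2.0, 4.0), (-1.0, 0.5), (3.0, -9.0)):
    c = -a*a / b              # chosen so that det = a*(-a) - b*c = 0
    A = ((a, b), (c, -a))     # trace-zero, non-invertible
    assert a*(-a) - b*c == 0
    assert matmul2(A, A) == ((0.0, 0.0), (0.0, 0.0))
```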
Questions:
- Is this a correct proof?
- What is the standard proof?
"Strange question":
- How can I know that there were only these 8 cases? That is, how do I know that only these 8 cases are relevant to $A^2 = \bf{0}$?
Answer:
It's good, but some might quibble with the third case. You require "and $A$ is not invertible" in your summary, which doesn't explicitly put conditions on $a,b,c$.
I'm not sure I can say "standard", but what first comes to mind for me is to use eigenvalues and Jordan canonical forms. Since $A^2=0$, $A$'s only eigenvalue is $0$. Then either:
- $A\sim\begin{bmatrix}0&0\\0&0\end{bmatrix}$. But that more or less immediately means $A=\begin{bmatrix}0&0\\0&0\end{bmatrix}$.
- $A\sim\begin{bmatrix}0&1\\0&0\end{bmatrix}$. This means for some invertible $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ you have: $$A=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}0&1\\0&0\end{bmatrix}\begin{bmatrix}a&b\\c&d\end{bmatrix}^{-1}=\frac{1}{ad-bc}\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$$ Simplifying the presentation, there is some nonzero $k$ such that $$A=k\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$$ (And allowing $k=0$ actually covers the first case.)
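If you want to double-check the conjugation formula numerically, here is a minimal Python sketch (the sample entries for $P$ are arbitrary, and `matmul2` is a hand-rolled helper, not a library function):

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

a, b, c, d = 2.0, 1.0, 3.0, 5.0          # any invertible sample P
det = a*d - b*c
P = ((a, b), (c, d))
Pinv = ((d/det, -b/det), (-c/det, a/det))
N = ((0.0, 1.0), (0.0, 0.0))

# P N P^{-1} should equal (1/det) * [[-ac, a^2], [-c^2, ac]]
lhs = matmul2(matmul2(P, N), Pinv)
rhs = ((-a*c/det, a*a/det), (-c*c/det, a*c/det))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in (0, 1) for j in (0, 1))
```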
So now we know $A=k\begin{bmatrix}-ac&a^2\\-c^2&ac\end{bmatrix}$ for some real numbers $a,c,k$. Let's view the Cartesian $(a,c)$ in polar coordinates $(r;t)$. Then we know that $A=k\begin{bmatrix}-r^2\cos(t)\sin(t)&r^2\cos^2(t)\\-r^2\sin^2(t)&r^2\cos(t)\sin(t)\end{bmatrix}$. And we can absorb the $r^2$ into $k$, and write $$ \begin{align} A&=k\begin{bmatrix}-\cos(t)\sin(t)&\cos^2(t)\\-\sin^2(t)&\cos(t)\sin(t)\end{bmatrix}\\ &=\frac{k}2\begin{bmatrix}-\sin(2t)&1+\cos(2t)\\\cos(2t)-1&\sin(2t)\end{bmatrix}\\ &=q\begin{bmatrix}-\sin(s)&1+\cos(s)\\\cos(s)-1&\sin(s)\end{bmatrix} \end{align}$$ where $q=\frac{k}{2}$ and $s=2t$.
Conversely, if $A$ is of this form then it is easy to directly show that $A^2=0$.
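For instance, a quick numerical check of that converse for a few arbitrary values of $q$ and $s$ (`matmul2` is a hand-rolled helper):

```python
import math

def matmul2(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

# A = q * [[-sin s, 1+cos s], [cos s - 1, sin s]] should square to zero,
# since e.g. the (1,1) entry of A^2 is q^2 (sin^2 s + cos^2 s - 1) = 0.
for q, s in ((1.0, 0.3), (-2.5, 2.0), (0.7, 4.0)):
    A = ((-q*math.sin(s), q*(1 + math.cos(s))),
         (q*(math.cos(s) - 1), q*math.sin(s)))
    A2 = matmul2(A, A)
    assert all(abs(A2[i][j]) < 1e-12 for i in (0, 1) for j in (0, 1))
```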
This form shows that the "shape" of the collection of matrices with $A^2=0$ is a cone: choose some $q$ in $\mathbb{R}$ and some $s$ in $S^1$, and you have your $A$. The correspondence is one-to-one, except that when $q=0$, $s$ is irrelevant.
Having 8 cases was particular to your approach. So I'm not sure how to answer the question. My approach has two cases. Someone else's might have 16 cases. But your approach had 8 cases, and under that approach there aren't any more because you exhausted the logical options you focused on.