The original question was to show that $0$ is an eigenvalue of the matrix $A$, given that $A^2=0$. This was a straightforward exercise in left-multiplying by $A$:
$$Ax = \lambda x \Rightarrow A(Ax) = A(\lambda x) \Rightarrow A^2x = \lambda Ax \Rightarrow 0 = \lambda(\lambda x) = \lambda^2 x$$
Therefore $\lambda$ must be $0$, since an eigenvector $x$ cannot be $0$.
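As a quick numerical sanity check of this derivation (the specific matrix below is just an illustrative choice, not from the question), a nilpotent matrix has all eigenvalues equal to $0$:

```python
import numpy as np

# A sample nilpotent matrix (chosen for illustration): A @ A is the zero matrix.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(A @ A, 0)  # confirms A^2 = 0

# Consistent with the argument above, every eigenvalue of A is 0.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)  # -> [0. 0.]
```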
My question is whether one can prove directly, instead, that the columns of $A$ must be linearly dependent if $A^2=0$.
Here's what I've got.
Take the first two columns $c_1, c_2$ of $A$ and the first row $r_1$ of $A$.
Because $A$ can be multiplied by itself, $A$ must be square; call it $n \times n$.
Then, $A^2=0$ implies $$r_{11}c_{11} + r_{12}c_{12} + \cdots + r_{1n}c_{1n} = 0$$ and $$r_{11}c_{21} + r_{12}c_{22} + \cdots + r_{1n}c_{2n} = 0,$$ where $r_{1k}$ denotes the $k$-th entry of $r_1$ and $c_{jk}$ the $k$-th entry of $c_j$.
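The two displayed equations are instances of the general fact that $A^2=0$ makes every row of $A$ orthogonal to every column of $A$. A small numerical sketch (the $3 \times 3$ matrix is my own illustrative example):

```python
import numpy as np

# A sample 3x3 nilpotent matrix (illustration only): A^2 = 0.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(A @ A, 0)

# The (i, j) entry of A^2 is (row i of A) dot (column j of A),
# so A^2 = 0 forces every such dot product to vanish.
for i in range(3):
    for j in range(3):
        assert np.isclose(A[i, :] @ A[:, j], 0.0)
```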
Where can I go from here to show that $c_1$ and $c_2$ are linearly dependent, or can't I?
I assume $A$ is a square matrix.
If the columns of $A$ were linearly independent, then the image of $A$ would be the whole space, so $A$ would be invertible. Together with $A^2=0$, this forces $A=0$ (see $(\star)$ below), contradicting the assumed linear independence of the columns, since the columns of the zero matrix are certainly linearly dependent.
$(\star)$ Let $B$ be the inverse of $A$. Then $0=0B=A^2B=AAB=AI=A$.
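To illustrate the conclusion numerically (with a nonzero nilpotent matrix of my own choosing, not one from the question): any $A \neq 0$ with $A^2=0$ has rank less than $n$, i.e. linearly dependent columns.

```python
import numpy as np

# A nonzero matrix with A^2 = 0 (illustrative example):
# its second column is -2 times its first.
A = np.array([[2.0, -4.0],
              [1.0, -2.0]])
assert np.allclose(A @ A, 0)   # A is nilpotent
assert not np.allclose(A, 0)   # but A is not the zero matrix

# rank(A) < n, so the columns of A are linearly dependent,
# matching the contradiction argument above.
n = A.shape[0]
rank = np.linalg.matrix_rank(A)
print(rank < n)  # prints True
```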