If $A^2$ is the zero matrix, show that $A$ is linearly dependent?


The original question was to show that $0$ is an eigenvalue of the matrix $A$. This was a straightforward exercise in multiplying both sides on the left by $A$:

$Ax = \lambda x \Rightarrow AAx = A \lambda x \Rightarrow 0 = \lambda A x \Rightarrow 0 = \lambda^2 x$

Therefore $\lambda$ must be $0$, since an eigenvector $x$ cannot be $0$.
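As a quick numerical sanity check of this argument, here is a sketch in NumPy; the matrix $A$ below is an assumed example of a nonzero matrix with $A^2=0$, not part of the original question:

```python
import numpy as np

# Assumed example: a nonzero matrix whose square is the zero matrix.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

assert np.allclose(A @ A, 0)  # A^2 = 0

# The argument above shows every eigenvalue satisfies lambda^2 = 0,
# so all eigenvalues of A must be 0.
print(np.linalg.eigvals(A))  # [0. 0.]
```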

My question is whether one can prove, instead, that the columns of $A$ must be linearly dependent if $A^2=0$.

Here's what I've got.

Take the first two columns $c_1, c_2$ of $A$ and the first row $r_1$ of $A$.

Because one can multiply $A$ by $A$, $A$ must be square; call it $n \times n$.

Then, $A^2=0$ implies: $$(r_{11})(c_{11}) + (r_{12})(c_{12}) + \cdots + (r_{1n})(c_{1n}) = 0$$ and $$(r_{11})(c_{21}) + (r_{12})(c_{22}) + \cdots + (r_{1n})(c_{2n}) = 0$$

Where can I go from here to show that $c_1$ and $c_2$ are LD, or can't I?
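For intuition, here is a concrete instance of these two equations. The matrix below is an assumed example (not from the question) with $A^2=0$ and no zero column, and the dependence between $c_1$ and $c_2$ is visible directly:

```python
import numpy as np

# Assumed example: A != 0, no zero column, A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
assert np.allclose(A @ A, 0)

r1 = A[0, :]                # first row of A
c1, c2 = A[:, 0], A[:, 1]   # first two columns of A

# The two sums above: r1 dotted with c1 and with c2 are both 0.
print(r1 @ c1)  # 0.0
print(r1 @ c2)  # 0.0

# Here the linear dependence is explicit: c2 = -c1.
assert np.allclose(c2, -c1)
```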

There are 6 best solutions below


I assume $A$ is a square matrix.

If the columns of $A$ were linearly independent, then the image space of $A$ would be the whole space and so $A$ would be invertible. But this and $A^2=0$ imply $A=0$ (see $\star$ below), which contradicts the columns of $A$ being linearly independent, since the columns of the zero matrix are linearly dependent.

$(\star)$ Let $B$ be the inverse of $A$. Then $0=0B=A^2B=AAB=AI=A$.
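Numerically, the contrapositive can be sketched as follows; the matrix is an assumed example of a nonzero $A$ with $A^2=0$, and its rank comes out strictly below $n$, so the columns cannot be independent:

```python
import numpy as np

# Assumed example of a nonzero matrix with A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
assert np.allclose(A @ A, 0)

# If the columns were independent, A would have full rank and be invertible,
# which the argument above rules out. Indeed the rank is deficient:
print(np.linalg.matrix_rank(A))  # 1 (out of n = 2)
```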


Since you have already shown that $0$ is an eigenvalue of $A$, there exists $x \neq 0$ such that $Ax=0$. By definition, this exhibits a non-trivial linear combination of the columns of $A$ equal to $0$, hence the columns are linearly dependent.
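A minimal sketch of this answer in NumPy; the matrix and the null vector below are assumed examples:

```python
import numpy as np

# Assumed example with A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])

x = np.array([1.0, 1.0])     # a nonzero vector with Ax = 0
assert np.allclose(A @ x, 0)

# Ax = x[0]*c1 + x[1]*c2: a non-trivial combination of the columns equal to 0.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(combo)  # [0. 0.]
```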


As you said, $A$ is a square matrix. Since $A^2=0$, $$0=\det 0=\det(A^2)=(\det A)^2,$$ so $\det A=0$, and this means that the rows and columns of $A$ are LD.
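The determinant identity is easy to check numerically; the matrix is again an assumed example with $A^2=0$:

```python
import numpy as np

# Assumed example with A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# det(A)^2 = det(A^2) = det(0) = 0 forces det(A) = 0.
print(np.linalg.det(A))      # 0.0
print(np.linalg.det(A @ A))  # 0.0
```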


For another perspective: If a column of $A$ is zero, then the columns are clearly linearly dependent. Otherwise, each column of $A$ lies in the kernel of the map $v \mapsto Av$, hence each column of $A$ determines a linear dependence among the columns of $A$.
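This observation, that every column of $A$ lies in the kernel of $v \mapsto Av$, can be sketched as follows (the matrix is an assumed example with $A^2=0$ and no zero column):

```python
import numpy as np

# Assumed example: A^2 = 0 and no column of A is zero.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
assert np.allclose(A @ A, 0)

# A^2 = 0 says exactly that A sends each of its own columns to 0,
# so each nonzero column gives a linear dependence among the columns.
for j in range(A.shape[1]):
    assert np.allclose(A @ A[:, j], 0)
print("every column of A lies in ker(A)")
```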


Recall that the columns of $A$ are linearly dependent if and only if $\dim(\operatorname{null}(A))\neq 0$. So suppose $A^{2}=0$ and take any non-zero vector $v$. If $Av=0$ (i.e. $v\in \operatorname{null}(A)$), we are done. Otherwise $Av=w\in \operatorname{range}(A)$ for some non-zero $w$, and then $A^{2}v=A(Av)=Aw=0$, so $w\in \operatorname{null}(A)$. In either case $\operatorname{null}(A)$ is non-trivial, so the columns of $A$ must be linearly dependent.
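The case analysis in this answer can be traced on an assumed example; $v$ below is deliberately chosen outside $\operatorname{null}(A)$, so the second case applies:

```python
import numpy as np

# Assumed example with A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])

v = np.array([1.0, 0.0])  # chosen so that Av != 0 (second case)
w = A @ v                 # w is nonzero and lies in range(A)
print(w)                  # [1. 1.]

# Since A^2 = 0, Aw = A^2 v = 0, so w is in null(A):
assert np.allclose(A @ w, 0)
```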


Think of the problem as $AB = 0_{matrix}$ instead of $A^2=0$, and analyze it as a matrix-vector product with each column of $B$. The first equation alone, $AB_1=0_{vector}$, implies by definition that the columns of $A$ are linearly dependent, provided $B_1\neq 0$ (and if $B_1=0$, then with $B=A$ the zero column already makes the columns of $A$ dependent).
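A sketch of this viewpoint, treating $A^2=0$ as $AB=0$ with $B=A$ (the matrix is an assumed example):

```python
import numpy as np

# Assumed example; set B = A so that A @ B = 0 is just A^2 = 0.
A = np.array([[1.0, -1.0],
              [1.0, -1.0]])
B = A.copy()
assert np.allclose(A @ B, 0)

# Column by column: A @ B[:, j] = 0 for each j. With B1 = B[:, 0] nonzero,
# this is a non-trivial combination of A's columns equal to 0.
B1 = B[:, 0]
print(A @ B1)  # [0. 0.]
```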