Matrices: proving that $AA^t=0$ implies $A=0$


I know that taking the trace of $AA^t$ gives $A=0$ (the computation is written out at the end of this question), but I was trying to formalise a proof I had in mind:

Suppose $A\neq 0$. Then for every $X$, $A+X\neq X$. So $(A+X)A^t\neq XA^t$ for every $X$, arriving at $XA^t\neq XA^t$, a contradiction; therefore $A=0$.

But I realise it may not always be true that $A+X\neq X$ implies $(A+X)A^t\neq XA^t$. Still, since we can pick any $X$, the argument seems convincing, just not formal enough, so I am looking for a way to improve this proof.

I now know that it is a good idea to look at $X^tAA^tX=0$ for any $X$, which implies $A^tX=0$, concluding the proof, but why is this implication true?
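
For reference, the trace computation mentioned at the start is, I believe, the following (with real entries):

$$0=\operatorname{tr}(AA^t)=\sum_{i}\sum_{j}a_{ij}^2,$$

and a sum of squares of real numbers vanishes only if every term does, so $a_{ij}=0$ for all $i,j$, i.e. $A=0$.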


There are 2 best solutions below


You use that $A\neq B$ implies $AC\neq BC$. However, this is not true in general (it already fails for $C=0$, but also for nonzero $C$). Saying "we can pick any $X$" is not a valid argument, I think.
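
For instance, here is one concrete illustration (my own choice of matrices, not from the original answer) showing that cancellation can fail even for nonzero $C$:

$$A=\begin{pmatrix}1&0\\0&0\end{pmatrix},\qquad B=\begin{pmatrix}2&0\\0&0\end{pmatrix},\qquad C=\begin{pmatrix}0&0\\0&1\end{pmatrix},$$

so $A\neq B$ and $C\neq 0$, yet $AC=BC=0$.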


Along the lines of your first (or last) remark: since $\langle AA^t x, x\rangle = \langle A^t x, A^t x\rangle$ vanishes for every $x$, we get $A^t x = 0$ for every $x$, hence $A^t = 0$ and so $A = 0$.
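
Written out in the question's notation (a sketch, assuming real entries, $X$ a column vector, and the standard inner product $\langle u,v\rangle=u^tv$):

$$0=X^t(AA^t)X=(A^tX)^t(A^tX)=\lVert A^tX\rVert^2\ \implies\ A^tX=0\quad\text{for every }X.$$

Taking $X=e_i$ for each standard basis vector shows that every column of $A^t$ is zero, hence $A=0$.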