Solving $Ax=B$: what's wrong with this linear algebra argument?


With $K>L$ and assuming that we are working with real variables, suppose that $B$ is $K\times 1$ and $A$ is $K\times L$ with full column rank. I'm trying to find $x$ ($L\times 1$) satisfying: $$ Ax=B.\tag{i} $$ There are more equations than unknowns $(K>L)$, so there is no guarantee that a solution exists. Yet I cannot find what is wrong with this argument: pre-multiply both sides of (i) by $A'$: $$ A'Ax=A'B\implies x=(A'A)^{-1}A'B.\tag{ii} $$ The matrix $A'A$ is invertible because $A$ has full column rank. Could you please explain why (ii) doesn't work? I can see that if $x=(A'A)^{-1}A'B$, then $$ Ax=A(A'A)^{-1}A'B, $$ which doesn't obviously simplify to $B$. But is that enough to conclude that (ii) is invalid? And if (ii) doesn't work, how can I solve (i) or show that no solution exists?
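For concreteness, here is a small numerical sketch (my own made-up example, not from the question) in which $A$ is $3\times 2$ with full column rank, $B$ is chosen outside the column space of $A$, and the candidate $x=(A'A)^{-1}A'B$ from (ii) is computed and checked against (i):

```python
import numpy as np

# Hypothetical example with K=3 > L=2; the columns of A are independent.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = np.array([1.0, 2.0, 3.0])  # the last entry makes (i) unsolvable

# The candidate from (ii): solve A'A x = A'B
x = np.linalg.solve(A.T @ A, A.T @ B)
print(x)      # [1. 2.]
print(A @ x)  # [1. 2. 0.] -- not equal to B, so this x does not satisfy (i)
```

So the formula in (ii) always produces *some* vector $x$, but that vector need not satisfy $Ax=B$.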

Best answer:

The difficulty is really one of logic, not algebra. It is true that $$Ax=B\quad\Longrightarrow\quad A'Ax=A'B,$$ but it is not true that $$Ax=B\quad\Longleftrightarrow\quad A'Ax=A'B.$$ You have shown correctly that *if* (i) has a solution, then that solution must be $x=(A'A)^{-1}A'B$; but this does not mean that a solution actually exists. To decide, substitute the candidate back into (i): a solution exists if and only if $A(A'A)^{-1}A'B=B$, that is, if and only if $B$ lies in the column space of $A$.
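This existence check is easy to carry out numerically. Below is a minimal sketch (the helper name `solve_or_report` and the example matrices are my own, not from the answer): compute the normal-equations candidate, then test whether it reproduces $B$.

```python
import numpy as np

def solve_or_report(A, B, tol=1e-10):
    """Hypothetical helper: return x if Ax = B is exactly solvable, else None.

    When A has full column rank, x = (A'A)^{-1} A'B is the unique
    least-squares minimizer; (i) has an exact solution iff A @ x equals B.
    """
    x = np.linalg.solve(A.T @ A, A.T @ B)
    if np.allclose(A @ x, B, atol=tol):
        return x   # exact solution of Ax = B
    return None    # B is not in the column space of A: no solution

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
print(solve_or_report(A, np.array([1.0, 2.0, 3.0])))  # consistent: [1. 2.]
print(solve_or_report(A, np.array([1.0, 2.0, 4.0])))  # inconsistent: None
```

In practice one would use `np.linalg.lstsq` rather than forming $A'A$ explicitly, but the check is the same: the system is consistent exactly when the least-squares residual is zero.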