(an exercise in logic) Consider a general system $AX = B$ of $m$ linear equations in $n$ unknowns, where $m$ and $n$ are not necessarily equal. The coefficient matrix $A$ may have a left inverse $L$, a matrix such that $LA = I_n$. If so, we may try to solve the system as we learn to do in school:$$AX = B,\quad LAX = LB,\quad X = LB.$$But when we try to check our work by running the solution backward, we run into trouble: If $X = LB$, then $AX = ALB$. We seem to want $L$ to be a right inverse, which isn't what was given.
- Work some examples to convince yourself that there is a problem here.
- Exactly what does the sequence of steps made above show? What would the existence of a right inverse show? ...
(Source: M. Artin, *Algebra*, 2nd ed., Chapter 1, Exercise M.8, p. 35.)
First, we write down the dimensions of the matrices: $A$ is $m \times n$, $X$ is $n \times 1$, $B$ is $m \times 1$, and $L$ is $n \times m$.
1.
Take $(m,n)=(3,1)$, so that $m>n$, with$$A = \begin{pmatrix}1 \\ 5 \\ 4\end{pmatrix},$$which has left inverse $L = \begin{pmatrix}1-5v-4w & v & w\end{pmatrix}$ for any fixed scalars $v,w$; indeed, $LA = (1-5v-4w) + 5v + 4w = 1$. Let$$B = \begin{pmatrix}b_{11} \\ b_{21} \\ b_{31}\end{pmatrix}.$$
Then $AX=B$, where $X=(x_{11})$, gives the simultaneous equations$$x_{11} = b_{11},\quad 5x_{11} = b_{21},\quad 4x_{11} = b_{31}.$$Thus, $AX=B$ is solvable if and only if$$B = \begin{pmatrix}t \\ 5t \\ 4t\end{pmatrix}$$for some scalar $t$.
Observe that $LB = \big((1-5v-4w)b_{11} + vb_{21} + wb_{31}\big)$ recovers the correct solution $t$ when $B$ has the form above, but otherwise$$ALB = \begin{pmatrix}(1-5v-4w)b_{11} + vb_{21} + wb_{31} \\ 5\big((1-5v-4w)b_{11} + vb_{21} + wb_{31}\big) \\ 4\big((1-5v-4w)b_{11} + vb_{21} + wb_{31}\big)\end{pmatrix} \ne B,$$since $ALB$ always has the solvable form $(s,\,5s,\,4s)^T$ while $B$, by assumption, does not.
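As a sanity check, here is a small numpy sketch of this example; the particular values of $v$, $w$, $t$, and the unsolvable $B$ are our own choices for illustration.

```python
import numpy as np

v, w = 2.0, -3.0
A = np.array([[1.0], [5.0], [4.0]])            # 3x1, so m = 3 > n = 1
L = np.array([[1 - 5*v - 4*w, v, w]])          # a 1x3 left inverse of A
print(np.allclose(L @ A, np.eye(1)))           # True: LA = I_1

t = 7.0
B_good = np.array([[t], [5*t], [4*t]])         # solvable right-hand side
print(L @ B_good)                              # [[7.]], the unique solution
print(np.allclose(A @ (L @ B_good), B_good))   # True: ALB = B

B_bad = np.array([[1.0], [0.0], [0.0]])        # not of the form (t, 5t, 4t)^T
print(np.allclose(A @ (L @ B_bad), B_bad))     # False: ALB != B
```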
2.
Suppose $A$ has a left inverse $L$. The sequence of steps in the problem statement correctly shows that if $AX=B$ holds for some $n$-vector $X$, then $X=LB$, which in turn yields $ALB=B$; in particular, $AX=B$ has at most one solution. However, as shown above, such an $X$ need not exist.
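As a side observation making the uniqueness claim concrete: since every left inverse $L$ satisfies $LB = LAX = X$ whenever $B = AX$, the value of $LB$ does not depend on the choice of $L$ for solvable $B$, whereas different left inverses can disagree on unsolvable $B$. A quick numpy check; the helper `left_inverse` is our own naming, instantiating the family from part 1.

```python
import numpy as np

A = np.array([[1.0], [5.0], [4.0]])

def left_inverse(v, w):
    # The two-parameter family of left inverses from part 1.
    return np.array([[1 - 5*v - 4*w, v, w]])

L1, L2 = left_inverse(0.0, 0.0), left_inverse(2.0, -3.0)

B = A @ np.array([[7.0]])               # solvable: B = AX with X = (7)
print(L1 @ B, L2 @ B)                   # both [[7.]]

B = np.array([[1.0], [0.0], [0.0]])     # unsolvable
print(L1 @ B, L2 @ B)                   # [[1.]] vs. [[3.]]
```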
Now suppose $A$ also has a right inverse $R$, an $n\times m$ matrix with $AR=I_m$. Then$$L = LI_m = L(AR) = (LA)R = I_nR = R;$$call this common matrix $Z=L=R$. If $m=n$, we know $Z=A^{-1}$ uniquely satisfies$$AZ=ZA=I_m=I_n.$$
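A quick numerical illustration of the square case; the random $4\times 4$ matrix is just a stand-in for an arbitrary invertible $A$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))        # almost surely invertible
Z = np.linalg.inv(A)                   # the two-sided inverse
print(np.allclose(Z @ A, np.eye(4)))   # True: Z is a left inverse
print(np.allclose(A @ Z, np.eye(4)))   # True: Z is a right inverse
```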
So assume $m\ne n$; without loss of generality $m>n$, since transposing $A$ swaps the roles of left and right inverses. We focus on $AZ=I_m$: if $U_1$, $U_2$, $\dots$, $U_m$ denote the rows of $A$ and $V_1,V_2,\ldots,V_m$ the columns of $Z$, all of them $n$-vectors, then $U_i\cdot V_j = [i=j]$ (Iverson bracket). Since $m>n$, the vectors $U_1$, $\dots$, $U_m$ are linearly dependent, i.e. there exist scalars $\alpha_1$, $\dots$, $\alpha_m$, not all zero, such that $\sum_i\alpha_i U_i = 0$. But then for each $j$, we must have$$\alpha_j = \sum_i\alpha_i[i=j] = \sum_i\alpha_i\,(U_i\cdot V_j) = \Big(\sum_i\alpha_i U_i\Big)\cdot V_j = 0,$$a contradiction.
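For a numerical perspective on the same fact, we can ask numpy for the least-squares best attempt at $AZ=I_m$ with the running example's $A$ and observe that it still misses; this is a sketch of the obstruction, not a proof.

```python
import numpy as np

A = np.array([[1.0], [5.0], [4.0]])               # m = 3 > n = 1
# Solve AZ = I_3 column by column in the least-squares sense.
Z, residuals, rank, _ = np.linalg.lstsq(A, np.eye(3), rcond=None)
print(A @ Z)                                      # a rank-1 matrix, not I_3
print(np.allclose(A @ Z, np.eye(3)))              # False: no right inverse
```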
If $A$ has a left inverse $L$, it is in some sense more natural to think about "left systems" $YA = C$ than "right systems" $AX = B$: the former always has a solution, since $LA = I_n$ gives $(CL)A = C(LA) = C$, while the latter need not, as shown above. Of course, there may be solutions other than $Y = CL$.
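A short check of $(CL)A = C$ with the running example; the particular $C$ and the choice $v=2$, $w=-3$ are ours.

```python
import numpy as np

A = np.array([[1.0], [5.0], [4.0]])
L = np.array([[3.0, 2.0, -3.0]])       # the left inverse with v = 2, w = -3
C = np.array([[0.5], [-1.0]])          # an arbitrary 2x1 right-hand side
Y = C @ L                              # candidate solution of YA = C
print(np.allclose(Y @ A, C))           # True: (CL)A = C
```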
This observation is essentially the idea behind the last argument above. The $m$ systems $U_i\cdot V_j = [i=j]$ obtained by fixing $j$, all sharing the coefficient vectors $U_i$, form a "basis" for the more general systems $U_i\cdot V = c_i$, $i=1,2,\dots,m$: if each basis system has a solution $V_j$, then by linearity $V = \sum_j c_j V_j$ satisfies $U_i\cdot V = \sum_j c_j[i=j] = c_i$. Intuitively, we need $m\le n$ for these general systems to always have solutions.
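To illustrate the $m\le n$ case: when $A$ has full row rank, the standard formula $R = A^T(AA^T)^{-1}$ gives a right inverse, whose columns $V_j$ solve the basis systems, and any right-hand side is then handled by superposition. A small numpy sketch with an $A$ of our own choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])        # m = 2 <= n = 3, rows independent
R = A.T @ np.linalg.inv(A @ A.T)       # 3x2 right inverse: AR = I_2
print(np.allclose(A @ R, np.eye(2)))   # True: columns V_j solve A V_j = e_j

c = np.array([[4.0], [-1.0]])
V = R @ c                              # superposition: V = 4*V_1 - 1*V_2
print(np.allclose(A @ V, c))           # True: the general system is solved
```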