Consider a system of $m$ linear equations in $n$ unknowns, written as $AX=B$. Let $L$ be a left inverse of $A$, so $LA=I$. From $AX=B$ we get $LAX=LB$, hence $X=LB$. Up to this point I have no problem, but multiplying $X=LB$ on the left by $A$ gives $AX=ALB$, hence $B=ALB$. Does this also imply $AL=I$?
Is it true that every left inverse of a matrix is also a right inverse of it?
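The subtlety in the derivation above is that $B=ALB$ only says $AL$ acts as the identity on the column space of $A$, not on all of $\mathbf R^m$. A quick numerical sketch in NumPy (the tall matrix $A$ below is chosen purely for illustration):

```python
import numpy as np

# A tall matrix with full column rank: it has left inverses but no right inverse
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
L = np.linalg.pinv(A)                  # one particular left inverse of A
assert np.allclose(L @ A, np.eye(2))   # L A = I_2

# For any consistent system AX = B (i.e. B in the column space of A),
# B = A L B holds even though A L != I_3
X = np.array([2.0, -1.0])
B = A @ X
assert np.allclose(A @ (L @ B), B)     # B = ALB
print(np.allclose(A @ L, np.eye(3)))   # False: AL is only a projection
```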
2.9k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 4 answers below.
If $A$ is square and of full rank, and $L$ is its left inverse and $R$ is its right inverse, then from
$$LA = I$$
We get (if we multiply both sides by $R$ from the right)
$$LAR = IR\\ LI = IR\\ L = R$$
However, if $A$ is not square, then one of the two inverses does not exist and the other is not unique, so you cannot draw the same conclusion.
For example: $A=[1,1]$ has more than one right inverse: $$B=\begin{bmatrix}\frac12-t\\ \frac12+t\end{bmatrix}$$ is a right inverse of $A$ for every $t$, but $A$ has no left inverse, because for every $C\in\mathbb R^{2\times 1}$ the matrix $CA$ has rank $1$ or $0$, so it cannot equal $I_2$, which has rank $2$.
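Both claims are easy to verify numerically in NumPy (a few sample values of $t$ and one sample $C$):

```python
import numpy as np

A = np.array([[1.0, 1.0]])               # the 1x2 matrix from the example

# B is a right inverse of A for every t (written as a 2x1 column vector)
for t in [-2.0, 0.0, 0.5, 3.0]:
    B = np.array([[0.5 - t],
                  [0.5 + t]])
    assert np.allclose(A @ B, np.eye(1))

# Any candidate left inverse C is 2x1, so rank(C A) <= 1 < 2 = rank(I_2)
C = np.array([[2.0], [-3.0]])
print(np.linalg.matrix_rank(C @ A))      # 1
```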
It's not true for non-square matrices. Consider $$\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 0 & 1\end{pmatrix}=\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$ but $$\begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 0 & 1\end{pmatrix}\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 1 & 1\end{pmatrix} \neq I_3.$$
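The two products can be checked directly in NumPy:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])          # 2x3
B = np.array([[1.0, -1.0],
              [0.0,  0.0],
              [0.0,  1.0]])              # 3x2

print(A @ B)                             # the 2x2 identity: B is a right inverse
print(B @ A)                             # a 3x3 matrix that is not the identity
assert np.allclose(A @ B, np.eye(2))
assert not np.allclose(B @ A, np.eye(3))
```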
Case 0. If $A$ is square, then the answer is YES. Use determinants: suppose $LA = I$. Then $\mathrm{det}(L)\mathrm{det}(A) = 1$. So $\mathrm{det}(A)$ is non-zero. So $A$ has a two-sided inverse. Now use:
Proposition. If an element $A$ of a monoid has a two-sided inverse, then every left inverse of $A$ is a two-sided inverse.
So from $LA = I$ we deduce that $AL=I$.
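A numerical sketch of Case 0 in NumPy (the square matrix below is an arbitrary invertible example): solving $LA=I$ alone already forces $AL=I$.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])          # square, det(A) = 8, so invertible

# Solve L A = I for L, i.e. A^T L^T = I, with a linear solver
L = np.linalg.solve(A.T, np.eye(3)).T
assert np.allclose(L @ A, np.eye(3))     # L is a left inverse ...
assert np.allclose(A @ L, np.eye(3))     # ... and automatically a right inverse
```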
Case 1. If $A$ isn't square, then the answer is a big NO. For example, define a matrix $A$ as follows: $$A = \begin{bmatrix}1\\0\end{bmatrix}$$
For each $\lambda \in \mathbb{R}$, define a matrix $L_\lambda$ as follows: $$L_\lambda = \begin{bmatrix}1 & \lambda\end{bmatrix}.$$
Clearly: $$L_\lambda A = 1.$$
But $$A L_\lambda = \begin{bmatrix}1 & \lambda \\ 0 & 0\end{bmatrix},$$ which is never $I_2$.
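The same check in NumPy, with $\lambda = 5$ as an arbitrary sample value:

```python
import numpy as np

A = np.array([[1.0],
              [0.0]])                    # 2x1
lam = 5.0
L = np.array([[1.0, lam]])               # 1x2: a left inverse for every lam

assert np.allclose(L @ A, np.eye(1))     # L_lam A = [1]
print(A @ L)                             # [[1, lam], [0, 0]] -- never I_2
assert not np.allclose(A @ L, np.eye(2))
```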
Certainly not in general.
Let's see this from the point of view of linear maps: $A$ is the matrix associated with a linear map $f\colon\mathbf R^m\to\mathbf R^n$, and $L$ is associated with a linear map $u\colon\mathbf R^n\to\mathbf R^m$. $LA=I_m$ means $\;u\circ f=\operatorname{id}_{\mathbf R^m}$, which implies $f$ is injective and $u$ is surjective.
On the other hand $AL=I_n$ would mean $\;f\circ u=\operatorname{id}_{\mathbf R^n}$, which would imply $f$ surjective and $u$ injective, whence both would be isomorphisms.
This is of course impossible if $m\neq n$. If $m=n$, we know that for an endomorphism in finite dimension, injective $\iff$ surjective $\iff$ bijective.