If $A$ and $B$ are square matrices such that $AB=I$, then $BA=I$ too.


If $A$ is a square matrix whose entries are scalars from a field $\Bbb K$, then there exists a linear function $f:\Bbb K^n\rightarrow\Bbb K^n$ such that $$ f(e_j):=A_j $$ where $A_j$ is the $j$-th column of $A$, for each $j=1,\dots,n$.

Then, if $I$ is the identity matrix and $A$ and $B$ are two square matrices such that $AB=I$, there exist two functions $f,g:\Bbb K^n\rightarrow\Bbb K^n$ such that the identity matrix is the matrix of the composition $(f\circ g)$, which means that $(f\circ g)$ is the identity on $\Bbb K^n$. So to prove the statement of the question I have to prove that $(g\circ f)$ is the identity too. How can I prove this? Could someone help me, please?
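Before looking at proofs, the claim can be sanity-checked numerically (a sketch using NumPy, not a proof; the random matrix $A$ is assumed invertible, which holds almost surely for a random real matrix):

```python
import numpy as np

# Sanity check (not a proof): build B as a right inverse of a random A,
# so A @ B = I, and confirm numerically that B @ A = I as well.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))      # a generic A is invertible almost surely
B = np.linalg.solve(A, np.eye(n))    # B solves A @ B = I

assert np.allclose(A @ B, np.eye(n))  # the hypothesis AB = I
assert np.allclose(B @ A, np.eye(n))  # the claim BA = I
```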


There are 4 answers below.

On BEST ANSWER

$f\circ g = \text{id}$ implies $g$ is injective (if a composition is injective then the "inner" function is always injective). Since $g:\Bbb{K}^n\to \Bbb{K}^n$ is a map between vector spaces of the same dimension, the rank-nullity theorem implies $g$ is also bijective. Thus, $f = g^{-1}$.

Another way to phrase the argument is to say that since $f\circ g = \text{id}$, then $f$ is surjective (again if a composition is surjective, then the "outer" function is surjective). Again, rank-nullity implies $f$ is bijective, so that again $g=f^{-1}$.
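The rank-nullity step in this argument can be illustrated numerically (a sketch; here $B$ is built explicitly as an inverse of a random $A$, an assumption made purely for the demonstration):

```python
import numpy as np

# If A @ B = I, then B is injective, hence full rank; by rank-nullity on
# K^n, a full-rank endomorphism is bijective. Check the rank numerically.
rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))      # a generic A is invertible
B = np.linalg.inv(A)                 # so that A @ B = I

assert np.allclose(A @ B, np.eye(n))
assert np.linalg.matrix_rank(B) == n   # trivial kernel: B is bijective
```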

On

Let $E = {\Bbb K}^n$. If $AB=I$ on $E$, then both $A$ and $B$ must have full rank, because the dimension is finite. We have $(BA)^2=BABA=B(AB)A=BA$, so $BA$ restricted to the image of $BA$ is the identity; but since $A$ and $B$ have full rank, this image is all of $E$, which allows us to conclude that $BA=I$.
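The idempotence step $(BA)^2 = BA$ is worth seeing in a case where full rank fails: with rectangular matrices, $AB=I$ can hold while $BA$ is idempotent but not the identity, which shows exactly where the square, full-rank hypothesis enters (a hypothetical example for illustration):

```python
import numpy as np

# A is 2x3 and B is 3x2 with A @ B = I_2. Then B @ A is idempotent,
# but its image is only 2-dimensional inside K^3, so B @ A != I_3.
A = np.array([[1., 0., 0.],
              [0., 1., 0.]])
B = A.T                                  # a right inverse of A
BA = B @ A

assert np.allclose(A @ B, np.eye(2))     # AB = I on K^2
assert np.allclose(BA @ BA, BA)          # (BA)^2 = BA: idempotent
assert not np.allclose(BA, np.eye(3))    # but BA != I: image is a proper subspace
```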

On

Note the similarity of block matrices: $\begin{pmatrix}I & -A \\ 0 & I \end{pmatrix}\begin{pmatrix}AB & 0 \\ B & 0 \end{pmatrix}\begin{pmatrix}I & A \\ 0 & I \end{pmatrix} = \begin{pmatrix}0 & 0 \\ B & BA \end{pmatrix}$.

It follows that $AB$ and $BA$ have the same non-zero eigenvalues, counting multiplicities. In particular, if $\lambda = 1$ is an eigenvalue of $AB$, then it is an eigenvalue of $BA$.
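This eigenvalue fact can be checked numerically (a sketch with NumPy; the rectangular shapes are chosen so that $AB$ and $BA$ even have different sizes, yet share their non-zero spectrum):

```python
import numpy as np

# For A (3x2) and B (2x3), AB is 3x3 and BA is 2x2, yet their
# non-zero eigenvalues coincide, as the block-similarity above predicts.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((2, 3))

ev_AB = np.linalg.eigvals(A @ B)         # 3 eigenvalues; one is ~0
ev_BA = np.linalg.eigvals(B @ A)         # 2 eigenvalues

nz_AB = np.sort_complex([x for x in ev_AB if abs(x) > 1e-9])
nz_BA = np.sort_complex([x for x in ev_BA if abs(x) > 1e-9])
assert np.allclose(nz_AB, nz_BA)
```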

On

Theorem:

If $A$ and $B$ are two square matrices such that $A B=I$, then $B A=I$.

Proof:

By applying the properties of determinants of square matrices, we get that

$\det A\cdot\det B=\det(A B)=\det I=1\ne0$.

It follows that $\;\det A\ne0\;$ and $\;\det B\ne0\;$.

Now we consider the matrix

$C=\frac{1}{\det A}\cdot\text{adj}(A)$

where $\;\text{adj}(A)\;$ is the adjugate matrix of $A$, that is, the transpose of the cofactor matrix of $A$:

$\text{adj}(A)= \begin{pmatrix} A_{1,1} & A_{2,1} & \cdots & A_{n,1} \\ A_{1,2} & A_{2,2} & \cdots & A_{n,2} \\ \vdots & \vdots & \ddots & \vdots \\ A_{1,n} & A_{2,n} & \cdots & A_{n,n} \end{pmatrix}$

where $\;A_{i,j}\;$ is the cofactor of the element $a_{i,j}$ of the matrix $A$.

So $\;A_{i,j}=(-1)^{i+j}\det M_{i,j}$

where $\;M_{i,j}\;$ is the submatrix of $A$ formed by deleting the $i^{th}$ row and the $j^{th}$ column.

We are going to use the Laplace expansions, which are the following equalities:

$a_{i,1}A_{j,1}+a_{i,2}A_{j,2}+\ldots+a_{i,n}A_{j,n}= \begin{cases} \det A\;,\quad\text{ if } i=j\\ 0\;,\quad\quad\;\,\text{ if } i\ne j \end{cases}$

$A_{1,i}a_{1,j}+A_{2,i}a_{2,j} +\ldots+A_{n,i}a_{n,j}= \begin{cases} \det A\;,\quad\text{ if } i=j\\ 0\;,\quad\quad\;\,\text{ if } i\ne j \end{cases}$

By applying Laplace expansions, we get that

$A C = C A = I$.
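The construction of $C$ from cofactors, and the identity $AC=CA=I$, can be carried out directly in code (a sketch for small $n$; the helper `adjugate` is a hypothetical name introduced here for illustration):

```python
import numpy as np

def adjugate(A):
    """Adjugate of A: transpose of the cofactor matrix, built entrywise."""
    n = A.shape[0]
    cof = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # M_{i,j}: the submatrix of A with row i and column j deleted
            M = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(M)
    return cof.T                         # adjugate = cofactor matrix transposed

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))          # a generic A has det != 0
C = adjugate(A) / np.linalg.det(A)       # C = adj(A) / det(A), as in the proof

assert np.allclose(A @ C, np.eye(4))     # Laplace expansion along rows
assert np.allclose(C @ A, np.eye(4))     # Laplace expansion along columns
```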

Since by hypothesis $\;A B =I\;,\;$ we have $\;C(A B)= C I\;,\;$ hence $\;(C A)B=C,\;$ therefore $\;I B = C\;$ and so $\;B=C$.

Consequently,

$B A = C A = I\;,$

so we have proved that

$B A = I$.