A relation about invertible and nonsingular matrices


In the lectures I am following, we are trying to show that

$AB = I \implies B = A^{-1}$, where $A$ and $B$ are $n \times n$ square matrices. Of course, we don't yet know whether $A$ and $B$ are invertible or nonsingular; that is what we first need to show. It follows in the lectures that for any $\vec y \in \mathbb{R}^n$,

$$A(B\vec y) = \vec y,$$ and thus for every $\vec y$ the system $A\vec x = \vec y$ has the solution $\vec x = B\vec y$; that is, the system with coefficient matrix $A$ is consistent for every right-hand side. The proof then concludes that $A$ must be nonsingular.
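The consistency step can be checked concretely. Below is a minimal sketch with hypothetical $2 \times 2$ matrices of my own choosing (not from the lecture) satisfying $AB = I$, verifying that $\vec x = B\vec y$ solves $A\vec x = \vec y$ for an arbitrary $\vec y$:

```python
# Minimal 2x2 sanity check: if AB = I, then x = B y solves A x = y.
# The matrices here are illustrative, not from the lecture.

def matmul(M, N):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(M, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(M[i][k] * v[k] for k in range(2)) for i in range(2)]

A = [[2, 1], [1, 1]]
B = [[1, -1], [-1, 2]]    # chosen so that AB = I

assert matmul(A, B) == [[1, 0], [0, 1]]   # AB = I

y = [3, 5]                # an arbitrary right-hand side
x = matvec(B, y)          # candidate solution x = B y
assert matvec(A, x) == y  # A(B y) = y, so the system is consistent
```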

I am lost at this step. How did we jump from consistency to the fact that $A$ is nonsingular? How do I know that the solutions $B\vec y$ are unique for every $\vec y$? Couldn't the system have infinitely many solutions?

For reference, this is from Theodore Shifrin's Math 3500 lectures on YouTube, Day 33, around 35:00.

Best answer:

$AB = I \implies A$ and $B$ are nonsingular matrices.

Proof by contradiction:

Suppose $A$ is singular; then $\det(A) = 0.$

Since $\det(AB) = \det(A)\det(B),$ if $\det(A) = 0$ then $\det(AB) = 0.$

But $\det(I) = 1,$ so $\det(AB) \ne \det(I),$ which implies $AB \ne I.$

This contradicts the given condition that $AB = I.$

Hence $A$ is nonsingular.

The same logic applies to $B.$
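The multiplicativity of the determinant, and why a singular factor rules out $AB = I$, can be checked numerically. A small illustrative example (the matrices are my own, not from the answer):

```python
# Illustrative check of det(AB) = det(A) det(B) for 2x2 matrices,
# and that a singular A forces det(AB) = 0 != det(I) = 1.

def det2(M):
    """Determinant of a 2x2 matrix given as nested lists."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul(M, N):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [2, 4]]   # singular: rows are proportional, so det(A) = 0
B = [[5, -3], [7, 1]]  # arbitrary

assert det2(A) == 0
assert det2(matmul(A, B)) == det2(A) * det2(B) == 0
# Since det(I) = 1 != 0, AB can never equal the identity for this A.
```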