About inverse matrices


I've been reading about invertible matrices. I have a few questions:


One theorem says

The rank of an invertible matrix of size $n$ is $n$.

So, is it safe to say that a system whose coefficient matrix is invertible has exactly one solution?


Another says

If a matrix is invertible, then its inverse is unique.

So basically for a matrix $A$, there is only one matrix $B$ that fulfils

$$BA = I = AB$$

Right? If I have a matrix $C$ that is equivalent to $B$, $C$ still would not be considered the inverse of $A$, yes?


If $A$ is invertible, and $B$ is equivalent to $A$, is $B$ invertible as well? If yes, is the inverse of $B$ equivalent to the inverse of $A$?


Edit

My concept of "equivalence" is that, given two matrices $A$ and $B$, they are equivalent if and only if one of them is obtained by applying row operations to the other. Thus, both matrices share the same solution set (although their rows look different).


For your first question: you seem to have the right idea. The system $$ A \vec x = \vec b $$ (where $A$ is an $n \times n$ square matrix and $\vec b$ is a length-$n$ column vector) has a unique solution if and only if (that is, exactly when) $A$ has an inverse. In this case, we can solve the system by multiplying both sides by the inverse. That is, if $A^{-1}$ is the inverse of $A$, we have $$ A^{-1} A \vec x = A^{-1}\vec b \implies \overbrace{(A^{-1} A)}^{I} \vec x = A^{-1}\vec b \implies \vec x = A^{-1}\vec b $$

For your second question: yes, matrix inverses are unique. Here's a proof:
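To make this concrete, here is a small Python sketch (the matrix $A$ and vector $\vec b$ below are my own illustrative choices, not from the question): it inverts a $2 \times 2$ matrix with the adjugate formula and recovers the unique solution as $A^{-1}\vec b$.

```python
# Solving A x = b by multiplying with the inverse (2x2 example; the
# specific numbers are illustrative, not from the question).

def inverse_2x2(A):
    """Inverse of a 2x2 matrix [[a, b], [c, d]] via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular, no inverse exists")
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(A, v):
    """Multiply a 2x2 matrix by a length-2 column vector."""
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]

A = [[2.0, 1.0], [5.0, 3.0]]   # det = 1, so A is invertible
b = [3.0, 8.0]
x = matvec(inverse_2x2(A), b)  # x = A^{-1} b, the unique solution
print(x)                       # -> [1.0, 1.0]
```

Because $A$ is invertible, this $x$ is the only solution; a singular matrix would raise an error instead.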

"Row-operations" correspond to multiplication on the left by some invertible matrix (which we normally refer to as "elementary matrices"). In fact, any sequence of row operations can be represented as multiplication on the left by a single invertible matrix, and visa versa.

So, suppose that $A$ is invertible, $B$ is the inverse of $A$, and $C$ is equivalent to $B$. That is, we have $AB = BA = I$, and $C = EB$ for some invertible matrix $E$. Now, suppose $C$ is also an inverse of $A$. We then have $$ CA = I \implies (EB)A = I \implies E(BA) = I \implies E = I $$ That is, we would have to have $E = I$, which means that $C = IB = B$.
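As a numerical companion to the proof (the matrix below is my own example), computing the inverse by two unrelated methods, the adjugate formula and Gauss-Jordan elimination, yields the same matrix, as uniqueness demands.

```python
# Two independent computations of the inverse agree, illustrating
# (not proving) uniqueness. Example matrix chosen for illustration.

def inverse_adjugate(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def inverse_gauss_jordan(A):
    """Row-reduce the augmented matrix [A | I] to [I | A^{-1}]."""
    (a, b), (c, d) = A
    aug = [[a, b, 1.0, 0.0], [c, d, 0.0, 1.0]]
    p = aug[0][0]                                           # assumes a != 0
    aug[0] = [v / p for v in aug[0]]                        # R1 <- R1 / a
    f = aug[1][0]
    aug[1] = [aug[1][j] - f * aug[0][j] for j in range(4)]  # R2 <- R2 - c*R1
    p = aug[1][1]
    aug[1] = [v / p for v in aug[1]]                        # R2 <- R2 / pivot
    f = aug[0][1]
    aug[0] = [aug[0][j] - f * aug[1][j] for j in range(4)]  # R1 <- R1 - b'*R2
    return [row[2:] for row in aug]

A = [[2.0, 1.0], [5.0, 3.0]]
print(inverse_adjugate(A))      # -> [[3.0, -1.0], [-5.0, 2.0]]
print(inverse_gauss_jordan(A))  # same matrix
```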


First, if $C \sim B$ and $B = A^{-1}$, then it does not necessarily hold that $C = A^{-1}$: the condition $C \sim B$ is the same as $C = RB$ for some invertible matrix $R$ which achieves the row operations carrying $B$ to $C$. Then $$ CA = RBA = R(BA) = R \neq I $$ unless $R = I$, that is, unless $C = B$.
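A numeric check of this point (the matrices are my own choices): with $C = RB$ for a non-identity $R$, the product $CA$ comes out to $R$, not $I$, so $C$ is not an inverse of $A$.

```python
# If C = R B with R != I, then C A = R, not I (illustrative example).

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [5.0, 3.0]]
B = [[3.0, -1.0], [-5.0, 2.0]]   # B = A^{-1}, since det A = 1
R = [[1.0, 0.0], [2.0, 1.0]]     # one row operation: R2 <- R2 + 2*R1
C = matmul(R, B)                 # C ~ B, but C != B

print(matmul(B, A))  # -> [[1.0, 0.0], [0.0, 1.0]]  (the identity)
print(matmul(C, A))  # equals R, not the identity
```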

Next, if $A$ is invertible and $B \sim A$, then yes, $B$ is invertible: you can write $B = RA$ for some invertible matrix $R$ representing the row operations carrying $A$ to $B$. Then $$ B^{-1} = A^{-1} R^{-1} $$ As for whether $B^{-1}$ is equivalent to $A^{-1}$: yes, but for an unexciting reason. Every invertible $n \times n$ matrix row-reduces to $I$, so any two invertible matrices of the same size are row equivalent; in particular, $B^{-1} \sim A^{-1}$.
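Finally, a small check of the formula $B^{-1} = A^{-1}R^{-1}$ (again with matrices of my own choosing): forming $B = RA$ and multiplying the claimed inverse against it on both sides recovers the identity.

```python
# Verifying that B = R A is invertible with inverse A^{-1} R^{-1}
# (example matrices chosen for illustration).

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [5.0, 3.0]]
A_inv = [[3.0, -1.0], [-5.0, 2.0]]   # A^{-1}, since det A = 1
R = [[1.0, 0.0], [2.0, 1.0]]         # row operation R2 <- R2 + 2*R1
R_inv = [[1.0, 0.0], [-2.0, 1.0]]    # undoes it: R2 <- R2 - 2*R1

B = matmul(R, A)                     # B ~ A
B_inv = matmul(A_inv, R_inv)         # claimed inverse of B

print(matmul(B_inv, B))  # -> [[1.0, 0.0], [0.0, 1.0]]
print(matmul(B, B_inv))  # -> [[1.0, 0.0], [0.0, 1.0]]
```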