Prove $A$ and $B$ are equivalent if and only if $\text{rank}(A) =\text{rank}(B)$


Knowing that $A$ is equivalent to $B$ if there exists an invertible $m\times m$ matrix $P$ and an invertible $n\times n$ matrix $Q$ such that $PAQ = B$, how can I prove that $A$ and $B$ are equivalent iff $\text{rank}(A) =\text{rank}(B)$?

I've managed to solve the forward direction of the iff and am confident it is correct:

Suppose $A$ and $B$ are equivalent, so $PAQ = B$ for some invertible $P$ and $Q$. Since multiplying by any matrix cannot increase rank, $$ \text{rank}(B) = \text{rank}(PAQ) \leq \text{rank}(A) = \text{rank}(P^{-1} B Q^{-1}) \leq \text{rank}(B), $$ so all the inequalities must be equalities, and $\text{rank}(A) =\text{rank}(B)$.
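As a quick numerical sanity check (not part of the proof), one can verify with NumPy that multiplying by random invertible matrices preserves rank:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 5

A = rng.standard_normal((m, n))

# Random square matrices are invertible with probability 1;
# we assert it anyway to be safe.
P = rng.standard_normal((m, m))
Q = rng.standard_normal((n, n))
assert abs(np.linalg.det(P)) > 1e-9 and abs(np.linalg.det(Q)) > 1e-9

# rank(PAQ) = rank(A), as the forward direction claims.
assert np.linalg.matrix_rank(P @ A @ Q) == np.linalg.matrix_rank(A)
```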

I am not sure how to prove this statement in the reverse direction. I think the invertible matrix theorem could be useful for this problem.

Best Answer

Your argument for the forward direction is correct.

For the other direction, let $A$ and $B$ be $m \times n$ matrices with $\text{rank}(A) = \text{rank}(B)$.

Since $\text{rank}(A) = \text{rank}(B)$, the images of $A$ and $B$ have the same dimension; call it $k$. Let $\{v_i\}_{i=1,\dots,k}$ and $\{w_i\}_{i=1,\dots,k}$ be bases for the images of $A$ and $B$ respectively. Note that $k \leq \min(m,n)$, by the fact that row rank equals column rank. The images live inside $\mathbb R^m$, so we complete these to bases of $\mathbb R^m$; to avoid extra notation, the completed bases are $\{v_i\}_{i=1,\dots,m}$ and $\{w_i\}_{i=1,\dots,m}$.

Now, we want matrices $P,Q$ such that $PAQ = B$. Think of it this way: $Q$ rewrites the input vector of $B$ in a manner convenient for $A$. Then $A$ does its job on the rewritten vector, and $P$ rewrites the output of $A$ the way $B$ would have written it. That is the division of labour between $P$ and $Q$: they are translators between the input/output languages of $A$ and $B$.

For each $v_i$, pick a single preimage $e_i \in \mathbb R^n$ with $Ae_i = v_i$, and for each $w_i$ pick a preimage $f_i$ with $Bf_i = w_i$. Now, $\{e_i\}_{i=1,\dots,k}$ and $\{f_i\}_{i=1,\dots,k}$ are linearly independent sets (check!). By rank–nullity, $\ker A$ and $\ker B$ each have dimension $n-k$, so we can complete these sets to bases of $\mathbb R^n$ by adjoining a basis of $\ker A$ (respectively $\ker B$). Without confusion, we will call the completed bases $\{e_i\}_{i=1,\dots,n}$ and $\{f_i\}_{i=1,\dots,n}$; in particular, $Ae_i = 0$ and $Bf_i = 0$ for $i > k$.

The way to think about this is that $\{e_i\}$ is like $A$'s mother tongue and $\{f_i\}$ is like $B$'s mother tongue.

Therefore, the task of $Q$, when it receives a vector written in $B$'s mother tongue, is to convert it into $A$'s mother tongue. That leads to a very simple answer: $Q$ is the change-of-basis matrix from $\{f_i\}$ to $\{e_i\}$. That is, $Q$ is the matrix of the unique linear transformation that satisfies $Q(f_i) = e_i$ for all $i = 1,\dots,n$.
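Concretely, the unique linear map sending one basis to another is given by $Q = EF^{-1}$, where the columns of $E$ and $F$ are the respective basis vectors. A small hypothetical example in NumPy (the particular bases here are made up for illustration):

```python
import numpy as np

# Two made-up bases of R^2, stored as matrix columns.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns f_1, f_2
E = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # columns e_1, e_2

# The unique linear map with Q f_i = e_i is Q = E F^{-1}:
# it sends the i-th column of F to the i-th column of E.
Q = E @ np.linalg.inv(F)
assert np.allclose(Q @ F, E)
```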

Now that $Q$ has done its job, $A$ receives input in its mother tongue, so it outputs a vector expressed in terms of the basis vectors $v_1,\dots,v_k$.

But $B$ outputs in the basis $\{w_i\}$! So $P$ must be the unique linear transformation with $P(v_i) = w_i$ for all $i = 1,\dots,m$.

So $Q$ and $P$ are just basis transformation matrices.

Finally, we can provide a proof that $PAQ = B$. Let $x \in \mathbb R^n$.

Then $x = \sum_{i=1}^n x_if_i$ for some scalars $x_i$, so $Bx = \sum_{i=1}^n x_i Bf_i = \sum_{i=1}^k x_iw_i$, since $Bf_i = w_i$ for $i \leq k$ and $Bf_i = 0$ for $i > k$.

On the other hand, by the definition of $Q$, $Qx = \sum_{i=1}^n x_ie_i$. Since $Ae_i = v_i$ for $i \leq k$ and $Ae_i = 0$ for $i > k$, we get $AQx = \sum_{i=1}^k x_iv_i$, and then by what $P$ does, $PAQx = \sum_{i=1}^k x_iw_i$.

Hence $B = PAQ$. Since $P$ and $Q$ are change-of-basis matrices, they are invertible, and they have the right sizes: $P$ is $m \times m$ and $Q$ is $n \times n$.
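The whole construction can be sanity-checked numerically. The sketch below (one convenient implementation choice, not part of the proof: the adapted bases $e_i, f_i$ and their kernel completions are read off from the SVD) builds $P$ and $Q$ exactly as above and verifies $PAQ = B$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 5, 2

# Two random m x n matrices of the same rank k.
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
B = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == k

def adapted_bases(M, k):
    """Return (E, V): the columns of E are a basis of R^n with
    M @ E[:, :k] a basis of im(M) and M @ E[:, k:] ~ 0 (kernel),
    and V is a basis of R^m whose first k columns are that image basis."""
    U, S, Vt = np.linalg.svd(M)
    E = Vt.T                    # e_i: right singular vectors
    V = U.copy()                # left singular vectors complete the image basis
    V[:, :k] = M @ E[:, :k]     # v_i = M e_i for i <= k spans im(M)
    return E, V

E, V = adapted_bases(A, k)      # e_i and v_i for A
F, W = adapted_bases(B, k)      # f_i and w_i for B

P = W @ np.linalg.inv(V)        # P v_i = w_i
Q = E @ np.linalg.inv(F)        # Q f_i = e_i

assert np.allclose(P @ A @ Q, B)
```

Since $P$ and $Q$ agree with $B$'s action on the basis $\{f_i\}$, the check passes up to floating-point tolerance.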

This proves the proposition.