If $A$, $B$ and $C$ are all non-zero matrices, are statements I and II true or false?


Statement I: $AB = BC$ implies $A = C$.

Statement II: $AB = AC$ implies $B = C$.

Statement I: False. Take the inverse of $B$ and multiply both sides of the equation: $B^{-1}AB = C$. Hence this statement is false, as $A$ is not equal to $C$.

Statement II: False. This is only true when $A$ is invertible; if $A$ is not invertible, then $A^{-1}$ does not exist and the statement fails.

Is my explanation right for this?


There are 2 answers below.

BEST ANSWER

Throughout, I assume that we are talking about matrices with real entries, though much of what I say would still be true over other fields.

Don't try to prove that the statements are never true, because sometimes they are; see below for examples. Saying that statements like this are "not true" is normally interpreted as "not always true", which is equivalent to "sometimes false". A simple but effective way to prove this is to give an example for which they fail.

Here are some very simple examples which show that statement I is not always true.

$A = \begin{bmatrix}2 & 2\\0 & 0\end{bmatrix}$

$B = \begin{bmatrix}1 & 1\\0 & 0\end{bmatrix}$

$C = \begin{bmatrix}1 & 1\\1 & 1\end{bmatrix}$

(Check: $AB = BC = \begin{bmatrix}2 & 2\\0 & 0\end{bmatrix}$, yet $A \neq C$.)
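If it helps, a counterexample like this can be checked mechanically. Here is a small plain-Python sketch using $A = \begin{bmatrix}2 & 2\\0 & 0\end{bmatrix}$, $B = \begin{bmatrix}1 & 1\\0 & 0\end{bmatrix}$ and $C = \begin{bmatrix}1 & 1\\1 & 1\end{bmatrix}$ (these particular matrices are one choice among many):

```python
# Sanity check of a Statement I counterexample: AB = BC but A != C.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[2, 2], [0, 0]]
B = [[1, 1], [0, 0]]
C = [[1, 1], [1, 1]]

print(matmul(A, B))  # [[2, 2], [0, 0]]
print(matmul(B, C))  # [[2, 2], [0, 0]]  -- equal to AB, yet A != C
```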

A problem with your argument is that you have only derived another expression, $B^{-1}AB = C$, and this does not show that $A \neq C$: it requires $B$ to be invertible in the first place, and even then $A$ and $B$ might commute, in which case $B^{-1}AB = A = C$.

Try to do the same for statement II. Find some simple examples for which it does not work. Obviously, $A$ will need to be non-invertible.
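If you want to check your own answer against something concrete, here is one such example, verified in plain Python (these particular matrices are just one possible choice):

```python
# A Statement II counterexample: AB = AC with B != C and A singular.
# Any non-invertible A that annihilates the difference B - C works.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 0], [0, 0]]   # singular: the second row is zero
B = [[1, 0], [0, 1]]
C = [[1, 0], [0, 2]]

print(matmul(A, B))  # [[1, 0], [0, 0]]
print(matmul(A, C))  # [[1, 0], [0, 0]]  -- equal, yet B != C
```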

Statement II will be true when $A$ is invertible, but not only then; it may hold in other cases too. For example, it holds within the subset of $2 \times 2$ matrices of the form $\begin{bmatrix}x & 0\\0 & 0\end{bmatrix}$, provided that $A \neq 0$, even though none of these matrices is invertible. This is a trivial example, as this subset is isomorphic to the real numbers, or whatever field your entries come from. (Provided that you don't require an isomorphism to preserve the identity.)

For a more interesting example, the same applies to matrices of the form $\begin{bmatrix}x & -y\\y & x\end{bmatrix}$. This subset is isomorphic to the complex numbers (when the entries are real), so these matrices all commute, and all of them except $\begin{bmatrix}0 & 0\\0 & 0\end{bmatrix}$ are invertible.
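The claimed correspondence with the complex numbers can be verified numerically. This sketch maps $x + yi$ to $\begin{bmatrix}x & -y\\y & x\end{bmatrix}$ and checks that matrix multiplication matches complex multiplication (the particular values of $a$ and $b$ are arbitrary):

```python
# Matrices of the form [[x, -y], [y, x]] multiply exactly like the
# complex numbers x + yi, which is why cancellation works whenever
# A != 0 in this subset.

def to_matrix(z):
    """Represent the complex number x + yi as [[x, -y], [y, x]]."""
    return [[z.real, -z.imag], [z.imag, z.real]]

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

a, b = 2 + 3j, 1 - 4j
# Multiplying the matrices agrees with multiplying the complex numbers:
assert matmul(to_matrix(a), to_matrix(b)) == to_matrix(a * b)
```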

SECOND ANSWER

For statement II: if $A$ is not a square matrix, then $A$ is not invertible, but $A$ may still have a left inverse $L$ (that is, $LA = I$). Multiplying both sides of $AB = AC$ on the left by $L$ gives $B = C$, so statement II holds for any left-invertible $A$.
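A concrete sketch of this argument in plain Python (the $3 \times 2$ matrix $A$ and its left inverse $L$ are chosen here for illustration):

```python
# Left-inverse argument: A is 3x2 (not square, hence not invertible),
# but L with L A = I exists, so multiplying AB = AC on the left by L
# would force B = C.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 0], [0, 1], [0, 0]]   # 3x2, full column rank
L = [[1, 0, 0], [0, 1, 0]]     # left inverse: L A = I (2x2 identity)
B = [[5, 6], [7, 8]]

assert matmul(L, A) == [[1, 0], [0, 1]]
# Multiplying AB on the left by L recovers B:
assert matmul(L, matmul(A, B)) == B
```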