Prove that an invertible matrix is a square matrix


So, here's what I'm trying to prove right now. I'm hoping that someone can check my proof and suggest improvements.

Claim: Let $A$ be an invertible matrix. Then $A$ is square, i.e. $A \in M(n \times n, F)$ for some $n$.


Proof Attempt:

Let $A \in M(m \times n, F)$ be an invertible matrix. Then $A$ is the matrix of an isomorphism $f: F^n \to F^m$. We have to show that $n = m$. In other words, we have to prove that $\dim(F^n) = \dim(F^m)$.

To prove that, consider a basis of $F^n$, say $\beta_1 = (v_1,v_2,\ldots,v_n)$. Then the images of these basis vectors form the list $f(\beta_1) = (f(v_1),f(v_2),\ldots,f(v_n))$. I claim that this is a basis of $F^m$, given that $f$ is an isomorphism. This is easy to check:

Since $f$ is an isomorphism, $\forall w \in F^m: \exists! v \in F^n: f(v) = w$. Writing $v = \sum_{k=1}^{n} \lambda_k v_k$ in the basis $\beta_1$, linearity gives

$w = f(v) = f\left(\sum_{k=1}^{n} \lambda_k v_k\right) = \sum_{k=1}^{n} \lambda_k f(v_k)$

so $w \in L(f(v_1),f(v_2),\ldots,f(v_n))$.

That means that the list generates $F^m$. Furthermore, we have:

$\sum_{k=1}^{n} \lambda_k f(v_k) = 0 \implies f\left(\sum_{k=1}^{n} \lambda_k v_k\right) = 0$

$\implies \sum_{k=1}^{n} \lambda_k v_k = 0$ (since $f$ is injective, its kernel is trivial)

$\implies \lambda_1 = \lambda_2 = \ldots = \lambda_n = 0$ (since $\beta_1$ is linearly independent)

This proves linear independence. So, the list forms a basis for $F^m$. Its length is exactly $n$, and since every basis of $F^m$ has length $\dim(F^m) = m$, we conclude that $n = m$. This corresponds to the assertion that an invertible matrix must be square.
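As a sanity check (my own example, not part of the proof above): a non-square matrix can have a one-sided inverse, but never a two-sided one. For instance:

```latex
A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix},
\qquad
B = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix},
\qquad
AB = I_2,
\quad
BA = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} \neq I_3.
```

Here the map $f: F^3 \to F^2$ induced by $A$ sends $e_3$ to $0$, so it is not injective and hence not an isomorphism, consistent with the proof.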

Also, I may be missing something obvious, but I'm fairly certain there's a more general result here: this should hold for linear maps between arbitrary vector spaces, not just $F^n$ and $F^m$. It seems like the derivation above could be extended to that, assuming it's even correct.

Best Answer

The proof is fine.

The more general result you are looking for is: If $T$ is an isomorphism between vector spaces $V$ and $W$, then $\dim V=\dim W$. The proof is exactly the same, except for replacing $F^n,F^m$ by $V,W$.
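Written out in full, the general statement might look like this (a sketch, assuming both spaces are finite-dimensional, as in the question):

```latex
\begin{theorem}
Let $V, W$ be finite-dimensional vector spaces over $F$ and let
$T \colon V \to W$ be an isomorphism. Then $\dim V = \dim W$.
\end{theorem}

\begin{proof}[Proof sketch]
Pick a basis $(v_1, \ldots, v_n)$ of $V$. Exactly as above,
$(T v_1, \ldots, T v_n)$ spans $W$ because $T$ is surjective, and is
linearly independent because $T$ is injective. Hence it is a basis of
$W$ of length $n$, so $\dim W = n = \dim V$.
\end{proof}
```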