Do properties in linear algebra proved by using matrix transformations hold true irrespective of the choice of the bases for the vector spaces?


Let us say I am required to prove that V (dimension $= n$) and $\Bbb{R} ^ n$ are isomorphic and have chosen the matrix representation way of doing this.

Assume a linear transformation

$T : V \rightarrow \Bbb{R}^n$ that is injective.

The matrix representation $[T]$ is an $n \times n$ matrix. Injectivity means the kernel of $T$ is trivial, so by rank-nullity the rank of $[T]$ equals the dimension of $V$ ($= n$); that is, $[T]$ has full rank. Since the rank is also the dimension of the image, the image is all of $\Bbb{R}^n$, and thus $T$ is onto.
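A minimal numerical sketch of this argument, assuming $n = 3$ and an illustrative matrix (the entries are hypothetical, not from the question): a square matrix of full rank has trivial kernel, and its columns span $\Bbb{R}^n$, so the corresponding map is both injective and onto.

```python
import numpy as np

# Hypothetical matrix of an injective T : V -> R^3 in a chosen
# pair of ordered bases (entries are illustrative only).
M = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

n = M.shape[0]

# Injectivity <=> trivial kernel <=> rank [T] = dim V = n.
assert np.linalg.matrix_rank(M) == n

# For a square matrix, full rank also means the columns span R^n,
# so every b in R^n has a (unique) preimage: the map is onto.
b = np.array([1.0, -2.0, 5.0])
x = np.linalg.solve(M, b)
assert np.allclose(M @ x, b)
```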

Using this method, I believe I have shown that any such injective $T$ is also onto, hence an isomorphism.

My only problem is that $[T]$ comes from fixing ordered bases, and I worry that the proof holds only for that particular choice. How can I be sure that changing the bases keeps the proof intact?

PS: If the proof is wrong please include that too in the answer.

On BEST ANSWER

There is no problem in proving something with the use of a basis, because a finite-dimensional space does have one.

What's $[T]$? If we take the canonical basis for the codomain, the columns of $[T]$ are precisely the vectors $T(v_1),T(v_2),\dots,T(v_n)$, where $\{v_1,v_2,\dots,v_n\}$ is the chosen basis of $V$.

Note that $\{w_1=T(v_1),w_2=T(v_2),\dots,w_n=T(v_n)\}$ is also a basis of $\mathbb{R}^n$: the set is linearly independent because $T$ is injective, and $n$ linearly independent vectors in $\mathbb{R}^n$ form a basis. So we can define a linear map $S\colon\mathbb{R}^n\to V$ by declaring that $$ S(w_i)=v_i,\qquad i=1,2,\dots,n $$ This map is the inverse of $T$: can you see it? So you actually don't need $[T]$ at all. Nor do you need the codomain to be $\mathbb{R}^n$: it can be any $n$-dimensional vector space.
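A concrete sketch of this construction, assuming we identify $V$ with $\mathbb{R}^3$ via the chosen basis (the matrix entries are illustrative): the columns of $T$'s matrix are the $w_i$, and the matrix of $S$ is its inverse, which sends each $w_i$ back to the coordinate vector of $v_i$.

```python
import numpy as np

# Hypothetical matrix whose columns are w_i = T(v_i), with V
# identified with R^3 via the basis {v_1, v_2, v_3}.
T = np.array([[1.0, 0.0, 2.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

w = [T[:, i] for i in range(3)]   # images of the basis vectors
e = np.eye(3)                     # coordinate vectors of v_1, v_2, v_3

# S is defined by S(w_i) = v_i; as a matrix, S = T^{-1}.
S = np.linalg.inv(T)
for i in range(3):
    assert np.allclose(S @ w[i], e[i])   # S sends each w_i back to v_i

# S is a two-sided inverse of T, so T is an isomorphism.
assert np.allclose(S @ T, np.eye(3))
assert np.allclose(T @ S, np.eye(3))
```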

But you must use finite-dimensionality somewhere (and a basis is the natural tool), because for infinite-dimensional spaces it is false that injectivity implies surjectivity.
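To see the failure in infinite dimensions, a standard counterexample (not from the answer, added here as an illustration) is the right-shift map on the space of finitely supported sequences, $(a_0, a_1, \dots) \mapsto (0, a_0, a_1, \dots)$: it is linear and injective, yet no sequence with a nonzero first entry is ever hit.

```python
# Sketch of the right-shift counterexample, representing finitely
# supported sequences as Python lists of coefficients.

def right_shift(seq):
    """Linear right-shift: (a_0, a_1, ...) -> (0, a_0, a_1, ...)."""
    return [0] + list(seq)

a = [1, 2, 3]
b = [1, 2, 4]

# Injective: distinct inputs have distinct images.
assert right_shift(a) != right_shift(b)

# Not surjective: every image starts with 0, so a sequence like
# (1, 0, 0, ...) has no preimage under the shift.
assert right_shift(a)[0] == 0
```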