When is Linear Independence sufficient for forming a basis?


I am working through a question at the moment to show that if some Linear Operator $T: \mathbb{R}^n \to \mathbb{R}^n$ is injective, and $\{v_1, ..., v_n \}$ is a basis of $\mathbb{R}^n$, then $\{T(v_1), ..., T(v_n) \}$ is also a basis.

From what I have found, it is sufficient to just show Linear Independence of the transformed basis, as span does not need to be shown for "dimension reasons". However, I'm not exactly sure what this means, or when this applies. Could someone explain?


BEST ANSWER

In $\mathbb{R}^n$, any family of $n$ linearly independent vectors forms a basis. One first shows that every basis has the same length (hence dimension is well defined), and then notes that every linearly independent family can be extended to a basis; since a basis of $\mathbb{R}^n$ has exactly $n$ elements, a linearly independent family of $n$ vectors admits no proper extension and must already be a basis.

In your case, suppose $0=\sum_{i=1}^n a_i\, T(v_i)$ for scalars $a_i\in\mathbb{R}$. By linearity, $0=T\left(\sum_{i=1}^n a_i v_i\right)$; by injectivity, $0=\sum_{i=1}^n a_i v_i$; and by the linear independence of the $v_i$, we get $a_i=0$ for every $i$. Hence the family $\{T(v_1),\dots,T(v_n)\}$ is linearly independent, and by the remark above it is a basis.
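The argument can be checked numerically with NumPy. The matrix `T` below is a hypothetical example of an injective operator on $\mathbb{R}^3$ (for a square matrix, injective is equivalent to invertible, i.e. full rank), and the basis is taken to be the standard one; the images $T(v_i)$ are the columns of `T @ V`.

```python
import numpy as np

# Hypothetical example: an injective linear operator on R^3,
# represented by an invertible matrix (injective <=> full rank here).
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
assert np.linalg.matrix_rank(T) == 3  # trivial kernel, so T is injective

# Columns of V form a basis of R^3 (here, the standard basis).
V = np.eye(3)

# The images T(v_i) are the columns of T @ V.
TV = T @ V

# Rank 3 means the three image vectors are linearly independent,
# and n independent vectors in R^n form a basis.
print(np.linalg.matrix_rank(TV))  # 3
```

Swapping in any other invertible `T` or any other basis matrix `V` gives the same rank, which is exactly the claim proved above.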


An injective linear map has a kernel of dimension$~0$. If the linear map is from a finite-dimensional vector space to itself (or to another space of the same finite dimension), then by the rank-nullity theorem its rank equals the dimension of the codomain, so the map is surjective as well, and therefore an isomorphism of vector spaces. In general, the image of a basis of $V$ under an isomorphism of vector spaces $V\to W$ is a basis of$~W$, and this applies in your case.
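The rank-nullity step can be illustrated with a small numerical sketch (the $4\times 4$ matrix below is an assumed example, upper triangular with nonzero diagonal so that its kernel is trivial):

```python
import numpy as np

# For T : R^n -> R^n, rank-nullity says rank(T) + nullity(T) = n,
# so nullity 0 forces rank n, i.e. surjectivity.
n = 4
# Hypothetical example: upper triangular with nonzero diagonal,
# hence trivial kernel.
T = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0, 0.0],
              [0.0, 0.0, 2.0, 5.0],
              [0.0, 0.0, 0.0, 4.0]])

rank = np.linalg.matrix_rank(T)
nullity = n - rank
print(rank, nullity)  # 4 0

# An isomorphism carries any basis of R^n to a basis: the images of
# the standard basis vectors (the columns of T) have full rank.
images = T @ np.eye(n)
print(np.linalg.matrix_rank(images) == n)  # True
```

The same check works for a map between two different spaces of equal dimension, since only the shape and rank of the matrix matter.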