How can I prove that?


Suppose $T\colon V \to W$ is a linear transformation.

Suppose $\ker T$ has $\{u_1,\dots,u_n\}$ as a basis.

Suppose the image of $T$ has $\{T(v_1),\dots,T(v_m)\}$ as a basis.

How can I prove that

$$ \{v_1,\dots,v_m,u_1,\dots,u_n\} $$

is a basis for $V$?


I'm a beginning student, so I don't have a lot of background knowledge.


There are 2 best solutions below

BEST ANSWER

Let $B=\{v_1,\ldots,v_m,u_1,\ldots,u_n\}$. There are two things that must be checked: that $B$ spans $V$ and that it is linearly independent.

To prove that $\operatorname{span}(B)=V$, take $v\in V$. Then $T(v)\in\operatorname{im} T$, so it can be written as $\alpha_1 T(v_1)+\cdots+\alpha_m T(v_m)$. In other words, $T(v)=T(\alpha_1v_1+\cdots+\alpha_mv_m)$. But then $v-(\alpha_1v_1+\cdots+\alpha_mv_m)\in\ker T$, and therefore it can be written as $\beta_1u_1+\cdots+\beta_nu_n$. So$$v=\alpha_1v_1+\cdots+\alpha_mv_m+\beta_1u_1+\cdots+\beta_nu_n.$$

On the other hand, if$$\alpha_1v_1+\cdots+\alpha_mv_m+\beta_1u_1+\cdots+\beta_nu_n=0,\tag1$$then applying $T$ and using $T(u_k)=0$ gives$$0=T(0)=\alpha_1 T(v_1)+\cdots+\alpha_m T(v_m),$$and since the $T(v_j)$ are linearly independent, every $\alpha_j$ is equal to $0$. But then $(1)$ means that $\beta_1u_1+\cdots+\beta_nu_n=0$, and since the $u_k$ are linearly independent, every $\beta_k$ is equal to $0$ too.
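The argument can be sanity-checked numerically. Below is a small sketch (my own illustration, not part of the original answer) using a hypothetical matrix as the map $T\colon\mathbb{R}^4\to\mathbb{R}^3$: we extract a kernel basis and preimages of an image basis from the SVD, then verify that the combined family is a basis of $\mathbb{R}^4$.

```python
import numpy as np

# A hypothetical linear map T : R^4 -> R^3, given by a rank-2 matrix
# (third column = first + second, fourth column = 0).
T = np.array([[1., 0., 1., 0.],
              [0., 1., 1., 0.],
              [1., 1., 2., 0.]])

# Right-singular vectors with zero singular value span ker T,
# giving a kernel basis {u_1, ..., u_n}.
U, s, Vt = np.linalg.svd(T)
rank = int(np.sum(s > 1e-10))
kernel_basis = Vt[rank:].T            # columns u_1, ..., u_n (n = 4 - rank)

# The remaining right-singular vectors serve as v_1, ..., v_m:
# their images T(v_j) are scaled left-singular vectors, hence a
# basis of im T.
preimage_basis = Vt[:rank].T          # columns v_1, ..., v_m

# The claim: {v_1, ..., v_m, u_1, ..., u_n} is a basis of R^4.
B = np.hstack([preimage_basis, kernel_basis])
print(B.shape)                        # (4, 4): the right number of vectors
print(np.linalg.matrix_rank(B))       # 4: they are linearly independent
```

Here the combined matrix `B` has orthonormal columns by construction, so full rank confirms both spanning and independence for this example.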


Highlights of proof:

Take $\;v\in V\;$ , then

$$Tv\in\text{Im}\,T\implies Tv=\sum_{k=1}^m a_kTv_k\implies T\left(v-\sum_{k=1}^m a_kv_k\right)=0\implies v-\sum_{k=1}^m a_kv_k\in\ker T$$

Now try to complete the last few steps to finish the proof, after making sure you can justify each of the steps above.

This is essentially the proof of the well-known and very important rank-nullity (dimension) theorem for finite-dimensional spaces (in fact, only the domain $V$ has to be finite-dimensional).
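The dimension count in that theorem, $\dim V = \dim\operatorname{im} T + \dim\ker T$, can also be illustrated numerically. This is a quick sketch of mine with a hypothetical random matrix $T\colon\mathbb{R}^5\to\mathbb{R}^3$:

```python
import numpy as np

# Hypothetical example: T : R^5 -> R^3 as a random 3x5 matrix
# (generically of full rank 3, so the kernel has dimension 2).
rng = np.random.default_rng(0)
T = rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(T)
rank = int(np.sum(s > 1e-10))         # dim im T
kernel = Vt[rank:]                    # orthonormal rows spanning ker T
print(np.allclose(T @ kernel.T, 0))   # True: they really lie in ker T
print(rank + kernel.shape[0])         # 5 = dim V, as the theorem predicts
```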