a surjective linear transformation with a special property with respect to spanning sets must be injective


I am trying to produce an elementary proof of the following linear algebra proposition. I have an approach I have pursued, but so far cannot complete the proof. What would be a reasonable next step?

Proposition: Let V and W be non-zero vector spaces, and T be a surjective linear map of V onto W. Assume that property (1) holds:

for any subset S of V, we have that "TS spans W" implies "S spans V".

Prove that $T$ is injective.

Note that because $T$ is onto, the converse $(1)^C$ holds:

for any subset S of V, we have that "S spans V" implies "TS spans W".
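Before attempting a proof, it may help to see the contrapositive in coordinates: if $T$ is surjective but not injective, property (1) fails, i.e. some $S$ has $TS$ spanning $W$ while $S$ does not span $V$. A small NumPy sketch (the projection example is my own choice, not part of the question):

```python
import numpy as np

# Contrapositive of property (1), in coordinates (example is mine):
# T : R^3 -> R^2 drops the third coordinate, so T is surjective but
# not injective (kernel = span{e3}).  Take S = {e1, e2} in V = R^3.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])          # rows are the vectors of S

TS = S @ T.T                             # rows are T(s) for s in S

rank_TS = np.linalg.matrix_rank(TS)      # 2 = dim W: TS spans W ...
rank_S = np.linalg.matrix_rank(S)        # ... but 2 < 3 = dim V: S does not span V
```

So for this non-injective $T$, "TS spans W" does not imply "S spans V", consistent with the proposition.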

My Attempt so far:

Let $X$ denote $V - \ker T$.

$TX$ spans $W$: since $T$ is onto, every non-zero $w \in W$ is the image of some $v \notin \ker T$, while images of kernel elements are zero and hence of no use for spanning $W$. Then (1) tells us that $X$ spans $V$. Let us show that this forces a trivial kernel, by contradiction.
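The claim that $TX$ spans $W$ can be checked numerically in a small example (the matrices and vectors here are my own illustration, not part of the question):

```python
import numpy as np

# Sketch of the claim that TX spans W, for X = V \ ker T (example is mine):
# with T the projection R^3 -> R^2, the kernel is span{e3}.  A kernel vector
# maps to 0 and contributes nothing to a spanning set, while the images of
# two vectors outside the kernel already span W.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

k = np.array([0.0, 0.0, 5.0])            # a kernel vector: Tk = 0
x1 = np.array([1.0, 0.0, 2.0])           # in X: Tx1 = (1, 0) != 0
x2 = np.array([0.0, 1.0, -3.0])          # in X: Tx2 = (0, 1) != 0

image_of_kernel_vec = T @ k              # the zero vector of W
rank_TX = np.linalg.matrix_rank(np.vstack([T @ x1, T @ x2]))  # 2 = dim W
```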

Let $a \neq 0$ with $Ta = 0$. Since $X$ spans $V$, express $a$ using elements of $X$: $$a = \sum_{i=1}^k \alpha_i x_i.$$

In this expansion no $x_i$ belongs to $\ker T$, so each $Tx_i \neq 0$ and of course each $x_i \neq 0$. We can assume each $\alpha_i$ is non-zero, and that the $x_i$ are $k \geq 2$ distinct vectors of $V$ (if $k = 1$, then $0 = Ta = \alpha_1 Tx_1$ with $Tx_1 \neq 0$ would force $\alpha_1 = 0$, contradicting $a \neq 0$). Let us also assume that $\{x_1, \dots , x_k\}$ is a basis for a subspace $M$, in which $a$ is found. Then in the image space $TM$ we have

$$0 = \sum_{i=1}^k \alpha_i \, Tx_i,$$

although perhaps we have $Tx_j = Tx_l$ for $j \neq l$.

Remark: this is the point where I could use some help. I did try some further steps, see below, but I am less hopeful about them. Yet another path I tried is this one: A property of a surjective linear transformation, to do with preserving sets of generators, preserving independence

Looking again at

$$a = \sum_{i=1}^k \alpha_i x_i,$$

we see that $x_k$ is in the span of $a, x_1, \dots , x_{k-1}.$ Therefore $\langle x_1, \dots , x_{k}\rangle=\langle a, x_1, \dots , x_{k-1}\rangle$, and then

$$TM=T\langle x_1, \dots , x_{k}\rangle =T\langle a, x_1, \dots , x_{k-1}\rangle,$$ from which we obtain $$TM=\langle Tx_1, \dots , Tx_{k}\rangle =\langle Tx_1, \dots , Tx_{k-1}\rangle .$$

How to approach a contradiction from here, I do not know.


There are 2 best solutions below


Sketch/hint:
Suppose $T$ were not injective, so $Tv_1=Tv_2$ for some $v_1\not=v_2$. Equivalently (if we let $v=v_2-v_1$) there is some $v\not=0$ with $Tv=0$. There is a basis $B$ for $V$ with $v\in B$ (using that the set $\{v\}$ is linearly independent, and could be extended to a basis). Let $C=B\setminus\{v\}$ and let $S=\mathrm{span}(C)$. Notice that $S$ is a (proper) subspace of $V$ (with $v\not\in S$), and try to prove that $TS$ spans $W$ (in fact $TS=W$).

Verification:
Indeed take any $w\in W$ (with $w\not=0$, if such a $w$ exists), then $w=Tz$ for some $z\in V$. Since $B$ is a basis we have $z=a_1b_1+\dots+a_nb_n$ for some $n\ge1$, some coefficients $a_i$, and some $b_i\in B$, $1\le i\le n$. If $v$ is not among the $b_i$ then we are done. If $v=b_j$ for some $j$ with $1\le j\le n$, then let $y=z-a_jb_j=a_1b_1+\dots+a_{j-1}b_{j-1}+a_{j+1}b_{j+1}+\dots+a_nb_n$. Then $y\in S$ and $Ty=Tz=w$.

(Note we need not assume that the spaces involved are finite-dimensional. In a variation of the proof, we could have taken $S=C$, instead of $S=\mathrm{span}(C)$.)
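The hint can be played out in coordinates. Everything concrete below (the projection $T$, the vector $v$, the basis $B$) is my own choice of example, not part of the answer:

```python
import numpy as np

# Numerical sketch of the hint (all concrete choices are mine): T is the
# projection R^3 -> R^2, v = e3 spans its kernel, and B = {e1, e2, e3} is a
# basis of V containing v.  With C = B \ {v}, the images T(C) already span W,
# so S = span(C) is a proper subspace of V with TS = W -- contradicting (1).
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

v = np.array([0.0, 0.0, 1.0])            # v != 0 with Tv = 0
B = np.eye(3)                            # basis of V = R^3 containing v (= e3)
C = B[:2]                                # C = B \ {v}

assert np.allclose(T @ v, 0)             # v lies in ker T
rank_C = np.linalg.matrix_rank(C)        # 2 < 3: span(C) is a proper subspace
rank_TC = np.linalg.matrix_rank(C @ T.T) # 2 = dim W: T(span C) spans W
```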


Preliminary Remark: This proof I wrote before I appreciated the fact explained in Hoffman and Kunze: We can define linear independence (LI) and dependence (LD) not on sets of vectors, but rather on $n$-tuples of vectors (i.e. finite sequences $\alpha_1,\dots,\alpha_n$). This introduction of order causes no trouble, for if the $\alpha_j$ are LI, they are distinct, and so we may pass to the set $\{\alpha_j\}_1^n$; it really has $n$ vectors. So no confusion arises in discussing bases and dimension: when $\dim V<\infty$, it is the largest $n$ such that some $n$-tuple is LI, etc.

Hence this proof displays the type of argument needed if one does not follow the advice just summarized.

Lemma: For any given subset (not a sequence) $S'$ of space $W$, there is a subset $S$ of $V$ that is in bijection with $S'$ and such that

  • $TS \,=\,S'$
  • linear independence is preserved in the reverse direction: if $S'$ is linearly independent, then so is $S$.

Remarks on Lemma: We do not claim that this bijection is $T$ itself. Rather, it is $T$ restricted to $S$. Also, $(1)$ is not needed to prove this lemma.

Proof of Lemma: see below.

Proof of Proposition:

Let $B'$ be any basis of $W$. Let $B$ be the subset of $V$, in one-one correspondence with $B'$, promised by the lemma.

$TB\,=\,B'$, and $TB$ spans $W$. By property $(1)$, we have that $B$ spans $V$. By the second bullet of the lemma, $B$ is linearly independent, as $B'$ is a basis. Thus $B$ is a basis of space $V$. That makes it clear that $T$ is an injection. For instance, we can show that $\ker T$ is trivial:

Let $Tz = 0$ with $z \neq 0$. Write the unique expression of $z$ in terms of $B$:

$$z \,=\, \sum_{i=1}^k \gamma_i \, b_i.$$

Passing to the range space,

$$0\,=\,Tz \,=\, \sum_{i=1}^k \gamma_i \, Tb_i\,=\, \sum_{i=1}^k \gamma_i \, b'_i.$$

But this is impossible: some $\gamma_i$ is non-zero (as $z \neq 0$) and the $b'_i$ are distinct, so this is a non-trivial expansion of zero in the basis $B'$.
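The proof above uses property $(1)$ exactly once, to conclude that $B$ spans $V$, and that step is essential. A small numerical check (the projection example is my own, not from the answer) shows that without $(1)$ the pre-images of a basis $B'$ are linearly independent but need not span $V$:

```python
import numpy as np

# Without property (1): pre-images of a basis B' of W under a surjective,
# non-injective T are linearly independent, yet fail to span V (example mine).
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])          # projection R^3 -> R^2, not injective

Bp = np.eye(2)                           # B' = {e1, e2}, a basis of W
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])          # one pre-image per basis vector

assert np.allclose(B @ T.T, Bp)          # TB = B'
rank_B = np.linalg.matrix_rank(B)        # 2: B is linearly independent ...
dim_V = 3                                # ... but B cannot span V = R^3
```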

Proof of Lemma:

If $S'$ is empty, then $S$ empty satisfies the requirement. Otherwise denote $S' = \{v'_\alpha\}_{\alpha \in A}$. Map each $v'_\alpha$ to its pre-image ${T^{-1}}(v'_\alpha)$; since $T$ is surjective, this yields a non-empty family of non-empty, pairwise disjoint sets. By the axiom of choice, pick one element $v_\alpha \in {T^{-1}}(v'_\alpha)$ for each $\alpha \in A$. This choice is injective, as distinct pre-images are disjoint. The set $S = \{v_\alpha\}_{\alpha \in A}$ is the set we want: $TS = S'$, and $T$ restricted to $S$ is a bijection onto $S'$. We prove $S$ is linearly independent if $S'$ is:

Suppose there are distinct vectors $v_1, \dots, v_k \in S$ with

$$c_1 v_1 + \dots + c_k v_k\,=\,0.$$

Then $T(c_1 v_1 + \dots + c_k v_k)\,=\,0,$ and so

$$c_1 v'_1 + \dots + c_k v'_k\,=\,0.$$

Hence all the $c_i$ are equal to zero, since the $v'_i$ are distinct vectors of the linearly independent set $S'$.
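In finite dimensions the lemma's construction can be sketched numerically. Using the pseudoinverse as the "choice function" for pre-images is my own device, not part of the answer:

```python
import numpy as np

# Sketch of the lemma in coordinates (pinv as "choice function" is my device):
# for each v' in S' pick the pre-image v = pinv(T) @ v', so that Tv = v'
# (valid because T has full row rank, i.e. is surjective).  If S' is linearly
# independent, the chosen set S is linearly independent too.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])          # a surjective map R^3 -> R^2

Sp = np.array([[1.0, 0.0],
               [1.0, 1.0]])              # rows: a linearly independent S' in W

S = Sp @ np.linalg.pinv(T).T             # rows: one chosen pre-image per v'

assert np.allclose(S @ T.T, Sp)          # TS = S', as the lemma promises
rank_S = np.linalg.matrix_rank(S)        # 2: S inherits linear independence
```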