Basis in the vector space of all polynomials


Let $V$ be the vector space of all polynomials $p(t) = a_0 + a_1t + \cdots + a_nt^n$, where $n \in\mathbb{N}$ and $a_0,\ldots,a_n \in\mathbb{R}$. How can I prove that $\gamma = \{1,t,t^2,\ldots\}$ is a basis of $V$, and use it to find a linear transformation $T:V \rightarrow V$ such that $T$ is surjective but not injective? Is $T(x) = x^2$ an example of such a transformation? Some help, please.


Best answer

You seem to have a fundamental misunderstanding about this question.

$T$ is a transformation from the space of polynomials in $t$ to itself. So the input to $T$ is a polynomial, and the output is another polynomial. Two standard linear transformations are differentiation and integration from $t=0$. Namely, we can describe the differentiation operator $T(p) = \frac{dp}{dt}$ by saying that if $p(t) = a_0 + a_1 t + \cdots + a_n t^n$, then $$ T[p(t)] = a_1 + 2a_2 t + \cdots + na_n t^{n-1} $$ Similarly, we can describe the integration operator $T(p) = \int_0^t p(x)\,dx$ by saying that if $p(t) = a_0 + a_1 t + \cdots + a_n t^n$, then $$ T[p(t)] = a_0t + \frac{a_1}{2} t^2 + \cdots + \frac{a_n}{n+1} t^{n+1} $$ Try to prove that the first operator is surjective but not injective, while the second is injective but not surjective.
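A concrete way to experiment with these two operators is to encode a polynomial by its coefficient list $[a_0, a_1, \dots, a_n]$. A minimal sketch (the list encoding and the names `derivative` and `integral` are my own, not part of the answer):

```python
# Polynomials encoded as coefficient lists: [a0, a1, ..., an] ~ a0 + a1*t + ... + an*t^n.

def derivative(p):
    """D: a0 + a1 t + ... + an t^n  ->  a1 + 2 a2 t + ... + n an t^(n-1)."""
    return [k * p[k] for k in range(1, len(p))]

def integral(p):
    """I: integration from 0, sending a_k t^k to a_k/(k+1) t^(k+1)."""
    return [0] + [p[k] / (k + 1) for k in range(len(p))]

p = [3, 0, 2]             # 3 + 2t^2
print(derivative(p))      # [0, 4]            i.e. 4t
print(integral(p))        # [0, 3.0, 0.0, 2/3] i.e. 3t + (2/3)t^3
```

Note that `derivative(integral(p))` recovers `p` for every `p`, which is exactly why integration is injective and differentiation is surjective.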


As for your first question: you should look up the definition of a basis, and verify that $\gamma$ satisfies that definition.

Is it true that every member $v \in V$ can be written as a finite sum $$ v = \sum_{i=1}^n a_i v_i $$ where the $v_i$ are elements of $\gamma$? Is it true that if $$ \sum_{i=1}^n a_i v_i = 0 $$ for some scalars $a_i$, then all the $a_i$ must equal zero? If both hold, then $\gamma$ is a basis of $V$.

Another answer

Since every polynomial is, by definition, a linear combination of $1,t,t^2,\dots$, to prove they form a basis you only need to show that the powers are linearly independent, i.e. that $a_0\cdot 1+a_1t+\dots+a_nt^n=0$ (as the zero polynomial) implies all $a_i=0$. The simplest way is to differentiate the equality $n$ times, which eliminates every term except the last, leaving $n!\,a_n=0$ and hence $a_n=0$. Repeating the process shows that $a_{n-1}=0$, $a_{n-2}=0$, and so on.
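The step "differentiate $n$ times to isolate $n!\,a_n$" can be watched numerically with the same coefficient-list encoding used above (an illustrative assumption, not part of the answer):

```python
# Differentiating a degree-n polynomial n times leaves the constant n! * a_n.

def derivative(p):
    """Send [a0, a1, ..., an] to [a1, 2*a2, ..., n*an]."""
    return [k * p[k] for k in range(1, len(p))]

q = [7, -3, 0, 5]          # 7 - 3t + 5t^3, so n = 3 and a_n = 5
for _ in range(3):
    q = derivative(q)
print(q)                   # [30], which is 3! * 5
```

If the original polynomial were identically zero, each differentiated version would be too, forcing $3!\,a_3 = 0$ and so $a_3 = 0$.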

In a finite-dimensional space there can be no linear transformation that is surjective but not injective; this follows from the rank–nullity theorem. Your suggested transformation is not even linear. But on the space of all polynomials, which is infinite-dimensional with basis $\{1,t,t^2,\ldots\}$, such a transformation can be defined on the basis by $T(t^n):=nt^{n-1}$ for $n>0$ and $T(1):=0$, then extended by linearity to all polynomials via $T(p):=a_0T(1)+a_1T(t)+\dots+a_nT(t^n)$. It is surjective, since every basis element is in the range, but not injective, since both $0$ and $1$ are mapped to $0$. This transformation is just the derivative, $T(p)=p'$.
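Both claims can be spot-checked with the coefficient-list encoding (again an illustrative assumption; `T` and `preimage` are my names):

```python
# T is the derivative on coefficient lists; preimage is one antiderivative,
# i.e. a right inverse of T, which witnesses surjectivity.

def T(p):
    return [k * p[k] for k in range(1, len(p))]

def preimage(q):
    return [0] + [q[k] / (k + 1) for k in range(len(q))]

print(T([1]))          # []  -- the constant 1 maps to the zero polynomial
print(T([0]))          # []  -- so does 0: T is not injective
q = [2, 6]             # 2 + 6t
print(T(preimage(q)))  # [2.0, 6.0], equal to q: every q has a preimage
```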

Another answer

An alternative to the differentiation transformation is the shift,

$T(p(t)) = \dfrac{p(t)-p(0)}{t}$, which sends $a_0+a_1t+\cdots +a_nt^n$ to $a_1+a_2t+\cdots+a_nt^{n-1}$, i.e., it removes the constant term and decreases the other powers by $1$.
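In the coefficient-list encoding (an assumption for illustration), this shift is simply dropping the first entry, which makes its failure of injectivity obvious: any two polynomials differing only in the constant term collide.

```python
# The shift (p(t) - p(0)) / t removes the constant term and lowers every power by 1.

def shift_down(p):
    return p[1:]

print(shift_down([4, 1, 5]))   # [1, 5]: 4 + t + 5t^2  ->  1 + 5t
print(shift_down([9, 1, 5]))   # [1, 5]: same image, different input
```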

For injective but not surjective, there is the shift in the other direction,

$S(p(t)) = tp(t)$, which increases the powers by $1$. Or $S(p(t))=p(t^2)$, which doubles all the powers.
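Both of these upward shifts have equally simple coefficient-list forms (illustrative encoding and names, as above): $tp(t)$ prepends a zero, and $p(t^2)$ spreads the coefficients out over the even powers. Neither is surjective, since nothing hits the constant $1$ under the first, and nothing hits $t$ under the second.

```python
def mult_by_t(p):
    """S(p)(t) = t * p(t): prepend a zero coefficient."""
    return [0] + p

def square_t(p):
    """S(p)(t) = p(t^2): send a_k t^k to a_k t^(2k) by interleaving zeros."""
    out = []
    for a in p:
        out += [a, 0]
    return out[:-1] if out else out

print(mult_by_t([2, 3]))  # [0, 2, 3]: 2 + 3t -> 2t + 3t^2
print(square_t([2, 3]))   # [2, 0, 3]: 2 + 3t -> 2 + 3t^2
```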