I am having a hard time answering this question that comes in two parts.
Let $V := C[0,1]$. $V$ is a vector space with the operations of pointwise addition and pointwise scalar multiplication. For each $x \in [0,1]$, define $T_x : C[0,1] \rightarrow \Bbb R$ by $T_x(f) := f(x)$.
- Show that each $T_x \in V^* $
- Show that $\lbrace T_x : x \in [0,1]\rbrace$ is linearly independent and then use this to show that $V^*$ is infinite dimensional.
I'm not sure how to begin the first part of the question.
For the second, here is my attempt:
Suppose $x,y \in [0,1] $
$T_{x+y}(f) = f(x+y)$ for $f \in C[0,1]$
$T_{x+y}(f) = f(x) + f(y) = T_x(f) + T_y(f)$.
Therefore addition is preserved.
And for scalar multiplication: $T_{\alpha x}(f) = f(\alpha x) = \alpha f(x) = \alpha T_x(f)$ for $\alpha \in \Bbb R$.
But I don't know how this relates to $V^*$ being infinite dimensional (likely because I haven't got this right at all).
Help with the proof of either part is greatly appreciated.
Some hints (I'm assuming that $V^*$ denotes the algebraic dual space and not the topological dual space):
For the first part, you have to show $T_x \in V^*$. The elements of $V$ are continuous functions, not numbers $x \in [0,1]$. $T_x$ is defined on $V$, so you have to show (for fixed $x \in [0,1]$) that for $f,g \in C([0,1])$ you have $T_x (f+g) = T_x(f) + T_x(g)$ and that $T_x (\alpha f) = \alpha T_x (f)$ for all $\alpha \in \mathbb{R}$, $f \in V$.
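Spelled out, each of those is a one-line computation using the pointwise definitions of the operations on $V$ (for fixed $x \in [0,1]$):

$$T_x(f+g) = (f+g)(x) = f(x) + g(x) = T_x(f) + T_x(g),$$
$$T_x(\alpha f) = (\alpha f)(x) = \alpha f(x) = \alpha T_x(f).$$

These two identities are exactly the linearity of $T_x$, so $T_x \in V^*$.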
For the linear independence, let $x_1,\dots,x_n \in [0,1]$ be distinct and let $c_1,\dots,c_n \in \mathbb{R}$ such that $$c_1 T_{x_1} + c_2 T_{x_2} + \dots + c_n T_{x_n} = 0.$$ Define $$p_{x_1}(y) := \frac{\prod_{i=2}^{n} (y- x_i)}{\prod_{i=2}^{n}(x_1 - x_i)}.$$ What do you get when you calculate $T_{x_i}(p_{x_1})$?
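If you want a quick numerical sanity check before doing the algebra, here is a short sketch (the helper name `p_x1` and the sample nodes are my own choices, not part of the problem) that evaluates $p_{x_1}$ at each of the points $x_i$:

```python
def p_x1(y, nodes):
    """Evaluate p_{x_1}(y) = prod_{i>=2} (y - x_i) / prod_{i>=2} (x_1 - x_i)
    for a list of distinct nodes [x_1, x_2, ..., x_n]."""
    x1, rest = nodes[0], nodes[1:]
    num, den = 1.0, 1.0
    for xi in rest:
        num *= (y - xi)
        den *= (x1 - xi)
    return num / den

# example: five distinct points in [0,1]
nodes = [0.0, 0.25, 0.5, 0.75, 1.0]
values = [p_x1(x, nodes) for x in nodes]
print(values)  # → [1.0, 0.0, 0.0, 0.0, 0.0]
```

Since $T_{x_i}(p_{x_1}) = p_{x_1}(x_i)$, applying both sides of $c_1 T_{x_1} + \dots + c_n T_{x_n} = 0$ to $p_{x_1}$ isolates $c_1$; repeating with the analogous polynomial for each node forces every $c_i = 0$. Because $n$ was arbitrary, $V^*$ contains arbitrarily large linearly independent sets, hence is infinite dimensional.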