Basis of the Dual Space of Polynomial Spaces


I have the following problem:

Let $V=\mathcal{P}_n(\mathbb{R})$ be the vector space of polynomials of degree $\leq n$. Define $\alpha_k : V\to\mathbb{R}$ by

$$ \alpha_k(p)=\int_{-1}^{1}t^kp(t)dt,\qquad p\in V. $$

Show that $\{\alpha_0,\dots,\alpha_n\}$ is a basis for the dual space $V^*$ of $V$.

Here is my proof attempt:


We just need to show linear independence, since $\operatorname{dim}(V^*)=\operatorname{dim}(V)=n+1$ and a linearly independent set whose size equals the dimension of the space is a basis.

We proceed by induction on $n$. The base case $n=0$ is immediate: $\{\alpha_0\}$ consists of a single nonzero functional (note $\alpha_0(1)=2\neq 0$), and is therefore linearly independent.

For the inductive step, suppose the result holds for $n-1$, i.e., the set $\{\alpha_0,\dots,\alpha_{n-1}\}$ is linearly independent. Thus whenever $c_0,c_1,\dots,c_{n-1}$ satisfy

$$ \sum_{k=0}^{n-1}c_k\alpha_k(p)=0\qquad\forall p\in\mathcal{P}_{n-1}(\mathbb{R}), $$

we have $c_0=c_1=\cdots=c_{n-1}=0$. Let $c_0,\dots,c_n$ satisfy

$$ \sum_{k=0}^{n}c_k\alpha_k(p)=0\qquad\forall p\in\mathcal{P}_n(\mathbb{R}). $$

Suppose $n$ is even. The case where $n$ is odd is similar, and is omitted.

Let $p(t)=t$. Then we have \begin{equation*} \begin{split} 0=\sum_{k=0}^nc_k\alpha_k(t) &= \sum_{k=0}^{n-1}c_k\int_{-1}^{1}t^{k+1}dt+c_n\int_{-1}^{1}t^{n+1}dt \\ &= \sum_{k=0}^{n-1}c_k\int_{-1}^{1}t^{k+1}dt \\ &\implies c_0=c_2=\cdots=c_{n-2}=0.\\ \end{split} \end{equation*}

Now let $p(t)=1$. Then \begin{equation*} \begin{split} 0 &= c_1\int_{-1}^{1}tdt+c_3\int_{-1}^{1}t^3dt+\cdots +c_{n-1}\int_{-1}^{1}t^{n-1}dt+c_n\int_{-1}^{1}t^ndt \\ &= c_n\int_{-1}^{1}t^ndt \\ &\implies c_n=0. \end{split} \end{equation*}

Finally, with $p(t)=t^3$, we have

\begin{equation*} \begin{split} c_1\int_{-1}^{1}t^4dt+c_3\int_{-1}^{1}t^6dt+\cdots +c_{n-1}\int_{-1}^{1}t^{n+2}dt &= 0 \\ \implies c_1=c_3=\cdots=c_{n-1} &= 0. \\ \end{split} \end{equation*}

Hence the set $\{\alpha_0,\dots,\alpha_n\}$ is linearly independent. $\Box$


Does this work? Is there anywhere I left out too much detail or anything that needs to be modified? Any help is appreciated.


There are 3 answers below.

Best answer:

Another proof of linear independence is this:

Let $\lambda_0,\dots,\lambda_n$ be such that $\sum_{i=0}^n \lambda_i\alpha_i=0$. Hence, for all $p \in V$, $$ \int_{-1}^1q(t)p(t)dt=0 $$

where $q(t)=\sum_{i=0}^n\lambda_it^i$. Since $q\in V$, we may take $p=q$ and deduce $$ \int_{-1}^1q^2(t)dt=0 $$

hence $q(t)=0$ for all $t\in [-1,1]$, since $q^2$ is continuous and nonnegative. Now, a nonzero polynomial of degree at most $n$ has at most $n$ roots. But every $t \in [-1,1]$ is a root of $q$, so $q$ must be the zero polynomial. Hence, $\lambda_i=0$ for all $i\in \{0,\dots,n\}$.
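The independence can also be sanity-checked numerically (a sketch using NumPy; the helper name `gram_matrix` is mine): in the monomial basis $\{1,t,\dots,t^n\}$, the matrix $M_{kl}=\alpha_k(t^l)=\int_{-1}^1 t^{k+l}\,dt$ represents the functionals $\alpha_0,\dots,\alpha_n$, and they are independent exactly when $M$ has full rank.

```python
import numpy as np

def gram_matrix(n):
    """M[k, l] = alpha_k(t^l) = integral of t^(k+l) over [-1, 1]."""
    M = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        for l in range(n + 1):
            if (k + l) % 2 == 0:
                # int_{-1}^{1} t^m dt = 2/(m+1) for even m, 0 for odd m
                M[k, l] = 2.0 / (k + l + 1)
    return M

for n in range(6):
    # alpha_0, ..., alpha_n are independent iff M is nonsingular
    assert np.linalg.matrix_rank(gram_matrix(n)) == n + 1
```

This matrix is Hilbert-like and badly conditioned, so a floating-point rank check is only trustworthy for small $n$; the proof above is what handles the general case.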

Another answer:

The proof is incorrect. You can calculate (easily) that

$$ \alpha_k(t^l) = \int_{-1}^{1} t^{k+l}dt = \begin{cases} \frac{2}{k + l + 1} & \text{if $k + l$ is even} \\ 0 & \text{if $k + l$ is odd} \end{cases}$$

Your argument then goes: take an arbitrary linear combination $c_0 \alpha_0 + \cdots + c_n \alpha_n = 0$ and feed $p(t)=t$ into it: $$ (c_0 \alpha_0 + \cdots + c_n \alpha_n)(t) = c_0 \alpha_0(t) + \cdots + c_n \alpha_n(t) = \frac{2c_1}{3} + \frac{2c_3}{5} + \cdots = 0$$ but you cannot conclude from this single equation that $c_1 = 0$, $c_3 = 0$, etc.
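The closed form $\alpha_k(t^l)=2/(k+l+1)$ for $k+l$ even (and $0$ otherwise) is easy to confirm numerically; a sketch using Gauss–Legendre quadrature (the helper name `alpha` is mine):

```python
import numpy as np

# Gauss-Legendre nodes/weights on [-1, 1]: exact for polynomials up to degree 19
nodes, weights = np.polynomial.legendre.leggauss(10)

def alpha(k, p):
    """alpha_k(p) = integral of t^k * p(t) over [-1, 1], via quadrature."""
    return float(np.sum(weights * nodes**k * p(nodes)))

for k in range(5):
    for l in range(5):
        exact = 2.0 / (k + l + 1) if (k + l) % 2 == 0 else 0.0
        assert abs(alpha(k, lambda t: t**l) - exact) < 1e-12
```

In particular `alpha(k, lambda t: t)` vanishes for every even $k$, which is why feeding the single polynomial $p(t)=t$ yields only one linear equation in the odd-indexed coefficients.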

Another answer:

Big Hint: The Legendre polynomials $\{P_0(t), P_1(t), \ldots, P_n(t)\}$ form a basis for $\mathcal{P}_n(\mathbb{R})$, the vector space of polynomials of degree less than or equal to $n$. Form the matrix $A_{ij}=\alpha_i(P_j(t))$. Use the orthogonality of the Legendre polynomials to show that $A$ is non-singular. Now if $\sum_{k=0}^{n} c_k \alpha_k=0$, then for $i=0,1, \ldots, n$, $$ 0 = \sum_{k=0}^{n} c_k \alpha_k(P_i(t))= \sum_{k=0}^{n} c_k A_{ki}. $$

Can you now show that $c=(c_0, c_1, \ldots, c_n)$ must be the zero vector?

If you can prove that $\sum_{k=0}^{n} c_k \alpha_k=0$ implies that $c$ is the zero vector, you have proven that the $\{\alpha_k\}$ must be linearly independent.