Why does the Fourier sine series of $x^2$ on $[0,l]$ converge to 0?


When expanding, for example, $x^2$ on $[0,l]$ as a sine series, we get

$f(x) = \sum_{n=1}^{\infty} b_n \sin\left(\frac{n\pi x}{l}\right)$

If we plug $x=l$ into this expansion, the series sums to $0$. Why aren't we getting $\frac{f(l^+) + f(l^-)}{2} = \frac{l^2+0}{2}$?
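A quick numerical sanity check of the phenomenon in the question (a sketch, assuming $l=1$; the coefficients are computed by a simple trapezoid rule rather than in closed form):

```python
import numpy as np

l = 1.0                                  # assumed interval length for this check
x = np.linspace(0.0, l, 20001)           # fine grid for numerical integration

def b(n):
    # b_n = (2/l) * integral_0^l x^2 sin(n*pi*x/l) dx, via the trapezoid rule
    y = x ** 2 * np.sin(n * np.pi * x / l)
    return (2.0 / l) * np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

def partial_sum(xp, N):
    # N-term partial sum of the sine series at the point xp
    return sum(b(n) * np.sin(n * np.pi * xp / l) for n in range(1, N + 1))

print(partial_sum(0.5, 200))  # ~ 0.25 = (0.5)^2: the series recovers x^2 inside (0, l)
print(partial_sum(l, 200))    # ~ 0 up to rounding: every term contains sin(n*pi)
```

Inside the open interval the series does converge to $x^2$; only at the endpoint does it collapse to $0$, which is exactly what the question is about.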


BEST ANSWER

This follows from Dirichlet's theorem on the pointwise convergence of Fourier series.

Since you expand in a sine series, you are actually expanding the odd $2l$-periodic function defined on one period $[-l,l]$ by $f(x)=x^2$ if $x\in[0,l]$, and $f(x)=-x^2$ if $x\in(-l,0]$.

This $f$ is piecewise continuously differentiable, with jump discontinuities at the odd multiples of $l$ (where the periodic extension jumps from $l^2$ to $-l^2$), so its Fourier series converges everywhere, but not to $f$ at the jumps. It converges to the regularized $f$, or

$$\hat f(x)=\frac{f^+(x)+f^-(x)}{2}$$

When $x=l$, you can easily check that $f^-(l)=l^2$ but $f^+(l)=-l^2$, so your series converges to $\frac{l^2+(-l^2)}{2}=0$ at $x=l$. Actually, you can even show each term is $0$, since each has the form $b_n\sin\frac{n\pi x}{l}=b_n\sin\frac{n\pi l}{l}=b_n\sin n\pi=0$.