Is there a link between divergent series and discontinuities in Fourier integrals?


Take the derivative of the function $\frac{e^x-1}{e^x+1}$: $$\frac{d}{dx}\frac{e^x-1}{e^x+1}= \frac{d}{dx} \left(1- \frac{2}{e^x+1} \right) = \frac{2e^x}{(e^x+1)^2}$$

At $x=0$, this is equal to $\frac{1}{2}$. However, expanding the denominator as a geometric series, valid provided $|e^x|<1$ (i.e. $x<0$),

$$\frac{d}{dx}\left((e^x-1)\sum_{k=0}^{\infty}(-1)^ke^{kx} \right)= \frac{d}{dx}\left(\sum_{k=0}^{\infty}(-1)^ke^{(k+1)x}-(-1)^ke^{kx} \right)$$ $$= \sum_{k=0}^{\infty}(k+1)(-1)^ke^{(k+1)x}-k(-1)^ke^{kx}$$ At $x=0$, the derivative $$=\sum_{k=0}^{\infty}(-1)^k$$ is not defined; however, the "two most obvious values" for it are $0$ and $1$ (i.e. $(1-1)+(1-1)+\cdots$ or $1-(1-1)-(1-1)-\cdots$). Of course it could be anything, depending on how you do the summation (I think this was proven by Riemann).
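As a sanity check, here is a small Python sketch (the helper names `true_derivative` and `series_partial` are mine) comparing the term-by-term derivative against the closed form for $x<0$, and showing the oscillating partial sums at $x=0$:

```python
import math

def true_derivative(x):
    """Closed form: d/dx (e^x - 1)/(e^x + 1) = 2 e^x / (e^x + 1)^2."""
    return 2 * math.exp(x) / (math.exp(x) + 1) ** 2

def series_partial(x, n_terms):
    """Partial sum of sum_k [(k+1)(-1)^k e^{(k+1)x} - k(-1)^k e^{kx}]."""
    return sum(
        (k + 1) * (-1) ** k * math.exp((k + 1) * x)
        - k * (-1) ** k * math.exp(k * x)
        for k in range(n_terms)
    )

# For x < 0 the expansion is valid and the series matches the closed form:
print(abs(series_partial(-0.5, 200) - true_derivative(-0.5)))  # tiny

# At x = 0 each term collapses to (-1)^k, so the partial sums oscillate:
print([series_partial(0.0, n) for n in (1, 2, 3, 4)])  # [1.0, 0.0, 1.0, 0.0]
```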

Now, if a Fourier integral, $f(x)=\int_0^\infty A(\omega)\cos(\omega x)+B(\omega ) \sin(\omega x) \, d \omega$, is not continuous at $x_0$, $f(x_0)$ is defined as $$\frac{1}{2} \left( \lim_{x\to x_0^+}f(x) + \lim_{x\to x_0^-}f(x) \right),$$ for instance $f(x)=\begin{cases} k & \text{ if } |x|<1 \\ 0 & \text{ otherwise } \end{cases}$ is assigned the value $\frac{0+k}{2}=\frac{k}{2}$ at $x=\pm 1$.
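This midpoint behaviour can be checked numerically by truncating the Fourier integral of the rectangle (a sketch with $k=1$; for this even rectangle $B(\omega)=0$ and $A(\omega)=\frac{2k}{\pi}\frac{\sin\omega}{\omega}$, and the cutoff `L` and step count below are arbitrary truncation choices):

```python
import math

def fourier_integral(x, k=1.0, L=1000.0, steps=500_000):
    """Truncated Fourier integral of the rectangle f = k on |x| < 1:
    f(x) ~ (2k/pi) * integral_0^L sin(w) cos(w x) / w dw  (trapezoid rule)."""
    h = L / steps
    # Endpoint terms: the integrand tends to 1 as w -> 0.
    total = 0.5 * (1.0 + math.cos(L * x) * math.sin(L) / L)
    for i in range(1, steps):
        w = i * h
        total += math.sin(w) * math.cos(w * x) / w
    return (2 * k / math.pi) * total * h

interior = fourier_integral(0.0)  # interior point: close to k = 1
at_jump = fourier_integral(1.0)   # at the jump: close to k/2 = 0.5
print(interior, at_jump)
```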

It seems evident that there is a link between the fact that $e^0=1$ lies on the boundary of convergence for the geometric series and the fact that similar results are obtainable using Fourier integrals. Is there, and if so, what is it?

There are 2 answers below.

Accepted answer:
Your observation about power series can be stated more simply by formally plugging $x=-1$ into $$\frac{1}{1-x}=\sum_{n=0}^\infty x^n$$ The sum $\sum_{n=0}^\infty (-1)^n$ diverges; partial sums take values $0$ and $1$, and no rearrangement of the series will make it converge. The theorem of Riemann which you mentioned applies to divergent series whose terms tend to zero; for such series it asserts the existence of a convergent rearrangement with any desired sum.

It is true that summation methods exist that attach a well-defined sum to divergent series such as $\sum_{n=0}^\infty (-1)^n$. Perhaps the simplest is the Cesàro summation: instead of the limit of the partial sums $s_0$, $s_1$, $s_2$, ... consider the limit of their averages $s_0$, $\frac12(s_0+s_1)$, $\frac13(s_0+s_1+s_2)$,... For the series $\sum_{n=0}^\infty (-1)^n$ this process yields $1/2$, because the partial sums alternate between $1$ and $0$.
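A quick numerical sketch of these Cesàro means (the helper name `cesaro_means` is mine):

```python
from itertools import accumulate

def cesaro_means(terms):
    """Averages of the partial sums: s_0, (s_0+s_1)/2, (s_0+s_1+s_2)/3, ..."""
    partial = list(accumulate(terms))
    return [sum(partial[: n + 1]) / (n + 1) for n in range(len(partial))]

terms = [(-1) ** n for n in range(1000)]  # 1, -1, 1, -1, ...
means = cesaro_means(terms)
print(means[-1])  # 0.5
```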

Cesàro summation also helps Fourier series to converge. In general, the Fourier series of a continuous function $f$ may diverge; but in the sense of Cesàro summation it converges to $f$ uniformly. The switch from usual summation to the Cesàro summation amounts to the switch from the Dirichlet kernel to the Fejér kernel.
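As an illustration, here is a sketch using the square wave $\operatorname{sign}(x)$ on $(-\pi,\pi)$, whose Fourier series is $\frac{4}{\pi}\sum_{k\ge 0}\frac{\sin((2k+1)x)}{2k+1}$. This function is discontinuous rather than continuous, but the contrast between the two kernels is easy to see as the suppression of the Gibbs overshoot near the jump:

```python
import math

def partial_sum(x, m):
    """Dirichlet partial sum of the square wave's series, harmonics up to m."""
    return (4 / math.pi) * sum(
        math.sin(n * x) / n for n in range(1, m + 1) if n % 2 == 1
    )

def fejer_mean(x, N):
    """Fejer (Cesaro) mean: average of the partial sums S_0, ..., S_{N-1}."""
    return sum(partial_sum(x, m) for m in range(N)) / N

xs = [i / 1000 for i in range(1, 500)]            # sample points just right of the jump
gibbs = max(partial_sum(x, 99) for x in xs)        # overshoots the value 1
fejer = max(fejer_mean(x, 100) for x in xs)        # stays below 1
print(gibbs, fejer)
```

The Fejér kernel is nonnegative, so the Fejér means of a function bounded by $1$ are themselves bounded by $1$; the Dirichlet partial sums are not.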

Another way to look at Fourier series is that they are complex power series restricted to a circle. Unless the function whose Fourier series you consider is extremely nice (analytic), this circle is the boundary of convergence of the corresponding power series. Abel summation mentioned in the other post amounts to approaching a point on the boundary along the radial direction from inside the circle, where the series converges.

But the fact that at jump discontinuities the Fourier series (of a BV function) converges to $\frac12(f(x+)+f(x-))$ is not as deep. For convenience shift the discontinuity to $0$. Introduce $f_e(x)=\frac12(f(x)+f(-x))$ and $f_o(x)=\frac12(f(x)-f(-x))$, which are the even and odd parts of $f$, respectively. The Fourier series of $f$ is the sum of the cosine series of $f_e$ and the sine series of $f_o$. All the sines vanish at $0$. Since $f_e$ has a removable discontinuity at $0$ (and is BV), its cosine series converges to the limit of $f_e$ at $0$, which is $\frac12(f(0+)+f(0-))$.
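A minimal numerical illustration of this decomposition, using the step $f(x)=1$ for $0<x<\pi$, $f(x)=0$ for $-\pi<x<0$ (so $f_e\equiv\frac12$ and $f_o=\frac12\operatorname{sign}(x)$; the Fourier series is $\frac12+\frac{2}{\pi}\sum_{k\ge0}\frac{\sin((2k+1)x)}{2k+1}$):

```python
import math

def step_partial_sum(x, N):
    """Fourier partial sum of the step f = 1 on (0, pi), 0 on (-pi, 0).
    The constant 1/2 is the cosine series of the even part f_e."""
    return 0.5 + (2 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(N)
    )

# At the jump x = 0 every sine term vanishes, so each partial sum
# equals the even part's value 1/2 = (f(0+) + f(0-)) / 2 exactly:
print(step_partial_sum(0.0, 100))  # 0.5
```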

Second answer:
This is Abel regularization, applied to the series

$$ \sum_{n=0}^{\infty} (-1)^{n}e^{-tn} = \frac{1}{1+e^{-t}} $$

In the limit $t \to 0^{+}$ this gives the well-defined value $\frac{1}{2}$ for the series.
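Numerically (a sketch; `abel_sum` is a name of my choosing):

```python
import math

def abel_sum(t, n_terms=10_000):
    """Partial sum of sum_n (-1)^n e^{-tn}; the series converges for t > 0."""
    return sum((-1) ** n * math.exp(-t * n) for n in range(n_terms))

# Compare against the closed form 1/(1 + e^{-t}), which tends to 1/2 as t -> 0+:
for t in (1.0, 0.1, 0.01):
    print(t, abel_sum(t), 1 / (1 + math.exp(-t)))
```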