What about sinusoidal waves makes the Fourier Series so useful?


The Fourier series allows you to decompose a function on a continuous interval into a set of harmonics, and an increasingly better approximation can be found by taking more terms in the expansion. However, I could also do this trivially by approximating the function with a series of impulses. The more impulses I add, the better the approximation on the interval. What properties of the Fourier series make it a better decomposition?
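To make the "more terms, better approximation" claim concrete, here is a small sketch using the standard Fourier sine series of $f(x)=x$ on $[-\pi,\pi]$, whose coefficients $b_n=\frac{2(-1)^{n+1}}{n}$ are a textbook result. The choice of $f$ and the grid resolution are illustrative assumptions, not part of the question:

```python
import numpy as np

def partial_sum(x, N):
    """N-term Fourier sine series of f(x) = x on [-pi, pi]."""
    return sum(2 * (-1) ** (n + 1) / n * np.sin(n * x)
               for n in range(1, N + 1))

x = np.linspace(-np.pi, np.pi, 4001)
dx = x[1] - x[0]

# Discrete L2 error of the N-term partial sum for increasing N.
errors = [np.sqrt(np.sum((x - partial_sum(x, N)) ** 2) * dx)
          for N in (5, 20, 80)]
print(errors)  # the error shrinks as N grows
```

The error decreases monotonically in $N$ because each extra harmonic removes one more term from the tail $\sum_{n>N} b_n^2$ of the energy.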


There are 2 answers below.

BEST ANSWER

The functions $\sin$ and $\cos$ are so-called eigenfunctions of $D^2$, the square of the differentiation operator, in the sense that $$D^2\sin x = -\sin x,\quad D^2\cos x=-\cos x.$$ Furthermore, we of course have $$D\sin x=\cos x,\quad D\cos x=-\sin x.$$ This means it is much easier to work with Fourier series expansions when discussing differential equations. Suppose, for instance, we wanted to solve the ODE $$(D^2+3D+4)y=y''+3y'+4y=0.$$ One direct way would be to write the expansion $$y=\frac{a_0}{2}+\sum_{n\geq 1}a_n\cos(nx)+b_n\sin(nx)$$ so that $$ \begin{split} y' & = \sum_n-na_n\sin(nx)+nb_n\cos(nx)\\ y'' &= \sum_n-n^2a_n\cos(nx)-n^2b_n\sin(nx). \end{split} $$ Substituting back in, we obtain $$y''+3y'+4y=2a_0+\sum_n\bigl((4-n^2)a_n+3nb_n\bigr)\cos(nx)+\bigl((4-n^2)b_n-3na_n\bigr)\sin(nx)=0.$$ By uniqueness of the Fourier series, the coefficient of each harmonic must vanish, and we can now find $a_n,b_n$ explicitly.
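The substitution above can be checked symbolically for a single harmonic $a\cos(nx)+b\sin(nx)$: applying $y''+3y'+4y$ and reading off the $\cos$ and $\sin$ coefficients should give $(4-n^2)a+3nb$ and $(4-n^2)b-3na$. A quick sketch with SymPy:

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
n = sp.symbols('n', positive=True, integer=True)

# A single harmonic of the Fourier expansion.
y = a * sp.cos(n * x) + b * sp.sin(n * x)

# Apply the differential operator D^2 + 3D + 4 term by term.
expr = sp.expand(sp.diff(y, x, 2) + 3 * sp.diff(y, x) + 4 * y)

cos_coeff = expr.coeff(sp.cos(n * x))
sin_coeff = expr.coeff(sp.sin(n * x))
print(sp.simplify(cos_coeff - ((4 - n**2) * a + 3 * n * b)))  # 0
print(sp.simplify(sin_coeff - ((4 - n**2) * b - 3 * n * a)))  # 0
```

Because the operator maps each harmonic to a combination of the *same* harmonic, the infinite system decouples into independent $2\times 2$ systems, one per $n$.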

Of course, this is a somewhat contrived example since there is a much more direct method to solve the ODE, but the idea applies in general: the fact that the derivatives of elements of the set $\{\sin,\cos,-\sin,-\cos\}$ remain in the set is incredibly useful when discussing ODEs and PDEs in general. It is also what makes them more useful in certain cases than things like the Taylor series.

ANSWER

The sinusoidal functions are important because they describe the vibrational modes of a stretched string. The general solution of the vibrating-string problem was proposed by Bernoulli around 1750 and took the form of the following displacement function: $$ u(x,t)=\sum_{n=1}^{\infty}a_n\sin\frac{n\pi x}{a}\cos\frac{n\pi c}{a}(t-\beta_n). $$ This type of function satisfies the wave equation $u_{tt}=c^2u_{xx}$.
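One can verify symbolically that each mode in the sum above satisfies the wave equation $u_{tt}=c^2u_{xx}$; a sketch for a single mode (the symbol names mirror the formula):

```python
import sympy as sp

x, t, a, c, beta = sp.symbols('x t a c beta', positive=True)
n = sp.symbols('n', positive=True, integer=True)

# One standing-wave mode from Bernoulli's series solution.
u = sp.sin(n * sp.pi * x / a) * sp.cos(n * sp.pi * c / a * (t - beta))

# Residual of the wave equation u_tt - c^2 u_xx; should vanish identically.
residual = sp.diff(u, t, 2) - c**2 * sp.diff(u, x, 2)
print(sp.simplify(residual))  # 0
```

Since the wave equation is linear, any (suitably convergent) sum of such modes is again a solution, which is what makes the series form useful.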

The problem was how to match the initial position of the string, $u(x,0)$, to a given initial displacement function $f(x)$. This led to what is now called the Fourier series, even though Fourier had nothing to do with developing the series originally. Determining the constants $a_n$ to match the initial position was a perplexing problem that was solved by Euler and Clairaut (not Fourier). They discovered the integral relations that we now refer to as orthogonality for the sine and cosine functions of integer multiples of a base frequency, but they viewed these relations as restrictions on the types of displacement functions that were possible. Fourier conjectured that such expansions were always possible, with no particular evidence at first and without the backing of most mathematicians of the time. This was a big topic of debate for quite some time.

The utility of Fourier expansions was in decomposing an initial displacement into the harmonic modes of the system (which is where we get the term Harmonic Analysis), and then evolving these in time in the way suggested by the $u(x,t)$ given above. These "orthogonal expansions" were the first of their type, and they allowed you to isolate each unknown coefficient needed to expand an initial displacement function.

Impulses or square waves would not be a good decomposition for the vibrating-string problem for several reasons, mainly because they are not physical modes of the system.