How to take a partial derivative of an integral containing a Fourier series?


From Goldstein, Classical Mechanics (Chapter 2, p. 68, Problem 24).

Problem:

The one-dimensional harmonic oscillator has the Lagrangian $L=m\dot{x}^2/2-kx^2/2.$ Suppose you did not know the solution to the motion, but realized that the motion must be periodic and therefore could be described by a Fourier series of the form $$x(t) =\sum_{j=0}a_j\cos(j\omega t)$$ (taking $t=0$ at a turning point) where $\omega$ is the (unknown) angular frequency of the motion. This representation for $x(t)$ defines a many-parameter path for the system point in configuration space. (That is, a set of (infinite) parameters $a_0, a_1, a_2,\ldots$ defines a path $x = x(t)$). Consider the action integral $I$ between $t = 0$ and $t = T$, where $T = \frac{2\pi}{\omega}$. Show that $I$ is an extremum for nonvanishing $x$ only if $a_j = 0$ for $j\ne1$ , and only if $\omega^2 = \frac{k}{m} $. (Hint: calculate $I$ and take partial derivatives $\frac{\partial I}{\partial a_i} $, which must be 0. Since we are assuming that $x(t)$ repeats with the period $T$, the $j = 1$ term must be nonzero.)

All I know is $\delta I=0$ from Hamilton's principle (where $I = \int L\,dt$) and $L = T - V$, where $T$ is the kinetic energy and $V$ the potential energy. The trouble comes when trying to square a Fourier series (twice) inside the Lagrangian, which we must then integrate and differentiate with respect to the several $a_j$'s.

Best answer:

I) The Fourier series

$$\tag{1} x(t) ~=~\sum_{j\in\mathbb{N}_0}a_j\cos(j\omega t) ,\qquad \omega~>~0,$$

is a change of variables from paths $t\mapsto x(t)$ to Fourier coefficients $(a_j)_{j\in\mathbb{N}_0}$. The Lagrangian

$$\tag{2} L~=~\frac{m}{2}\dot{x}^2-\frac{k}{2}x^2, \qquad m,k~>~0,$$

for a harmonic oscillator leads to an action

$$\tag{3} I~=~\int_{t_1}^{t_2}\! dt ~L ~=~\int_{0}^{\frac{2\pi}{\omega}}\! dt ~L ~\stackrel{(1)+(2)}{=}~ \frac{\pi m\omega}{2}\sum_{j\in\mathbb{N}}j^2a_j^2-\frac{\pi k}{2\omega}\sum_{j\in\mathbb{N}} a_j^2-\frac{\pi k}{\omega}a_0^2,$$

where the cross terms drop out because of the orthogonality relations $\int_{0}^{2\pi/\omega}\! dt~\cos(j\omega t)\cos(l\omega t)~=~\frac{\pi}{\omega}\delta_{jl}~=~\int_{0}^{2\pi/\omega}\! dt~\sin(j\omega t)\sin(l\omega t)$ for $j,l\in\mathbb{N}$, while the constant $j=0$ mode contributes $\int_{0}^{2\pi/\omega}\! dt =\frac{2\pi}{\omega}$, i.e. twice the weight of the other modes.

II) The stationary condition for the action (3) is

$$\tag{4} 0~=~\frac{\partial I}{\partial a_0}~\stackrel{(3)}{=}~-\frac{2\pi k}{\omega}a_0 \qquad\text{and}\qquad \forall j\in\mathbb{N}: ~~0~=~\frac{\partial I}{\partial a_j} ~\stackrel{(3)}{=}~ \frac{\pi}{\omega}\left(m\omega^2 j^2-k\right)a_j,$$

which implies that

$$\tag{5} a_0~=~0 \qquad\text{and}\qquad \forall j\in\mathbb{N}: ~~ a_j~=~0 \quad\vee \quad j~=~\sqrt{\frac{k}{m\omega^2}}.$$
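The orthogonality bookkeeping can be sanity-checked numerically: the sketch below (function names, the truncation to four modes, and the test values are my own choices) integrates the Lagrangian over one period for a truncated series and compares the result with a closed form recomputed independently from the orthogonality relations.

```python
import math

def action(a, m, k, omega, n=20_000):
    """Numerically integrate I = int_0^{2*pi/omega} [ (m/2) xdot^2 - (k/2) x^2 ] dt
    for the truncated series x(t) = sum_j a[j] cos(j*omega*t), midpoint rule."""
    T = 2 * math.pi / omega
    dt = T / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * dt
        x = sum(aj * math.cos(j * omega * t) for j, aj in enumerate(a))
        v = -sum(aj * j * omega * math.sin(j * omega * t) for j, aj in enumerate(a))
        total += (0.5 * m * v * v - 0.5 * k * x * x) * dt
    return total

def action_closed(a, m, k, omega):
    """Closed form of the same integral, using the orthogonality relations:
    int cos(j w t) cos(l w t) dt = (pi/w) delta_{jl} for j, l >= 1,
    while the constant j = 0 mode contributes the full period 2*pi/w."""
    kin = (math.pi * m * omega / 2) * sum(j * j * aj * aj for j, aj in enumerate(a))
    pot = (math.pi * k / (2 * omega)) * sum(aj * aj for j, aj in enumerate(a) if j >= 1)
    pot += (math.pi * k / omega) * a[0] ** 2
    return kin - pot

m, k, omega = 1.0, 4.0, 1.3          # arbitrary test values
a = [0.2, 1.0, -0.5, 0.3]            # a_0 .. a_3
# The difference is tiny: the midpoint rule is exact (up to roundoff)
# for trigonometric polynomials sampled densely over a full period.
print(abs(action(a, m, k, omega) - action_closed(a, m, k, omega)))
```

Because the integrand is a trigonometric polynomial over exactly one period, the midpoint rule here agrees with the closed form to machine precision.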

In other words: at most one of the coefficients $a_j$ can be non-zero according to eq. (5).

  1. If $j_0:=\sqrt{\frac{k}{m\omega^2}}\in\mathbb{N}$, there exists a one-dimensional stationary line, where only $a_{j_0}$ is non-zero. In fact, the entire $a_{j_0}$ dependence drops out of the action (3).

  2. If $\sqrt{\frac{k}{m\omega^2}}\notin\mathbb{N}$, there exists only one stationary point, the trivial configuration, where all the $a_j$ are zero.
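Point 1 can be checked directly for the physically relevant case $j_0=1$, i.e. $\omega^2=k/m$: varying $a_1$ then leaves the action unchanged. A minimal numerical sketch (the helper name and test values are mine; the closed form of the action is recomputed from the orthogonality of $\cos(j\omega t)$):

```python
import math

def action_closed(a, m, k, omega):
    # Closed form of the action from the orthogonality of cos(j*omega*t)
    # (an assumption of this sketch, recomputed independently of the text).
    kin = (math.pi * m * omega / 2) * sum(j * j * aj * aj for j, aj in enumerate(a))
    pot = (math.pi * k / (2 * omega)) * sum(aj * aj for j, aj in enumerate(a) if j >= 1)
    pot += (math.pi * k / omega) * a[0] ** 2
    return kin - pot

m, k = 1.0, 4.0
omega = math.sqrt(k / m)             # then j0 = sqrt(k/(m*omega^2)) = 1
vals = [action_closed([0.0, a1], m, k, omega) for a1 in (0.5, 1.0, 2.0)]
print(vals)                          # all entries equal: the a_1 dependence cancels
```

The kinetic and potential quadratic coefficients of $a_1$ coincide when $\omega^2=k/m$, so the action is flat along the stationary line, as point 1 states.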

III) Finally, we would like to address why the exercise formulation seems to favor the stationary solution $\sqrt{\frac{k}{m\omega^2}}=1$. Eq. (1) implies the "boundary conditions"

$$\tag{6} x(t)~\stackrel{(1)}{=}~x(-t) \quad\text{and} \quad x(t)~\stackrel{(1)}{=}~x(t+\frac{2\pi}{\omega}).$$

Note that all stationary points (5) have $a_0=0$, and note that there is a minus sign in front of the $a_0^2$ term in the action (3). Hence, without further boundary conditions, the stationary points (5) are not minimum points for the action. It seems natural to additionally impose that the average

$$\tag{7} \frac{2\pi}{\omega}a_0~\stackrel{(1)}{=}~\int_{0}^{\frac{2\pi}{\omega}}\! dt~x(t)~=~0$$

should vanish. Let us assume (7) from now on. Then a stationary line with $\sqrt{\frac{k}{m\omega^2}}= 1$ is a minimum for the action (3), while a stationary line with $\sqrt{\frac{k}{m\omega^2}}\in\mathbb{N}\backslash\{1\}$ is only a saddle point. Concerning saddle points, see also my related Phys.SE answer here.
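The minimum-vs-saddle distinction can be read off mode by mode: after imposing (7) so that $a_0$ is frozen out, the action is a diagonal quadratic form in the remaining $a_j$, and the sign of each coefficient decides the character of the stationary line. A small sketch, assuming the per-mode coefficient $\frac{\pi}{2\omega}\left(m\omega^2j^2-k\right)$ of $a_j^2$ obtained from the orthogonality relations (function name and test values are mine):

```python
import math

def quad_coeff(j, m, k, omega):
    # Coefficient of a_j^2 (j >= 1) in the action, (pi/(2*omega))*(m*omega^2*j^2 - k);
    # its sign decides minimum vs. saddle along the a_j direction.
    return (math.pi / (2 * omega)) * (m * omega ** 2 * j ** 2 - k)

m, k = 1.0, 4.0

w1 = math.sqrt(k / m)        # case j0 = 1: every other mode costs action
print([quad_coeff(j, m, k, w1) > 0 for j in range(2, 6)])        # all True -> minimum

w2 = math.sqrt(k / (4 * m))  # case j0 = 2: the j = 1 mode lowers the action
print(quad_coeff(1, m, k, w2) < 0, quad_coeff(3, m, k, w2) > 0)  # True True -> saddle
```

For $j_0=1$ all non-flat directions have positive coefficients (a minimum along the stationary line), while for $j_0\ge 2$ the modes with $j<j_0$ have negative coefficients, which is exactly the saddle behavior described above.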