Find all $a, b, c$ for which $\sin(ax)$, $\sin(bx)$, and $\sin(cx)$ are linearly independent


Problem 7 on p.324 of Arnold's Ordinary Differential Equations asks:

Find all $a, b, c$ values for which the three functions $\sin(ax) , \sin(bx)$ and $\sin(cx)$ are linearly independent.

I computed the Wronskian, but could not come up with $a,b$ and $c$ values.


In this case the straightforward obstructions to linear independence turn out to be the only obstructions: no frequency may be zero, and no two frequencies may agree up to sign.

To see this, for a number $r$ put $\sigma_r:x\mapsto \sin(rx)$ and $\gamma_r:x\mapsto \cos(rx)$, so that

$$\sigma_r' = r\gamma_r,\qquad \gamma_r' = -r\sigma_r,$$

and taking higher-order derivatives one has, for any $k\in\mathbb{Z}_{\geq1}$,

$$\sigma_r^{(2k-1)}=(-1)^{k-1}r^{2k-1}\gamma_r.$$
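This odd-order derivative formula is easy to spot-check symbolically; a quick sketch with SymPy (assuming it is available):

```python
import sympy as sp

x, r = sp.symbols('x r')
k = 3  # spot-check one value; any positive integer behaves the same way

# (2k-1)-th derivative of sin(r*x) versus the closed form (-1)^(k-1) r^(2k-1) cos(r*x)
lhs = sp.diff(sp.sin(r * x), x, 2 * k - 1)
rhs = (-1) ** (k - 1) * r ** (2 * k - 1) * sp.cos(r * x)
print(sp.simplify(lhs - rhs) == 0)  # True
```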

Let $A,B,C$ be numbers such that

$$A\sigma_a + B\sigma_b + C\sigma_c = 0.$$

Differentiating this equation $2k-1$ times, evaluating at $x=0$ (where $\gamma_r(0)=1$), and dividing out the common sign $(-1)^{k-1}$, one obtains

$$Aa^{2k-1}+Bb^{2k-1}+Cc^{2k-1}=0.$$

Putting the equations for $k=1,2,3$ into matrix form one obtains

$$ \begin{pmatrix} a & b & c \\ a^3 & b^3 & c^3 \\ a^5 & b^5 & c^5 \end{pmatrix} \begin{pmatrix} A \\ B \\ C \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} $$

If this matrix equation has only the trivial solution, then $A=B=C=0$ is forced, so $\sigma_a,\sigma_b,\sigma_c$ are linearly independent; this happens exactly when

$$ \det\begin{pmatrix} a & b & c \\ a^3 & b^3 & c^3 \\ a^5 & b^5 & c^5 \end{pmatrix} \neq0.$$

Note that the determinant in question is a multiple of a Vandermonde determinant (in the squares of the frequencies):

$$ 0\neq\det\begin{pmatrix} a & b & c \\ a^3 & b^3 & c^3 \\ a^5 & b^5 & c^5 \end{pmatrix} =abc\det\begin{pmatrix} 1 & 1 & 1 \\ a^2 & b^2 & c^2 \\ a^4 & b^4 & c^4 \end{pmatrix} =abc(b^2-a^2)(c^2-a^2)(c^2-b^2). $$
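The factorization is easy to confirm symbolically; a small check with SymPy:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

# Matrix of odd powers coming from the derivative conditions
M = sp.Matrix([[a,    b,    c],
               [a**3, b**3, c**3],
               [a**5, b**5, c**5]])

claimed = a * b * c * (b**2 - a**2) * (c**2 - a**2) * (c**2 - b**2)
print(sp.expand(M.det() - claimed) == 0)  # True: determinant matches the factored form
```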

Conversely, if this product vanishes then some frequency is zero (so that sine is identically zero) or two frequencies agree up to sign (so the corresponding sines are equal or opposite), and the three functions are linearly dependent. Thus $\sigma_a,\sigma_b,\sigma_c$ are linearly independent iff

$$abc(b^2-a^2)(c^2-a^2)(c^2-b^2)\neq0.$$

Note that this argument extends to any (finite) number of sine functions.
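As a numerical sanity check (an illustration, not part of Arnold's problem), one can sample the three sines at generic points with NumPy and inspect the rank of the resulting matrix: independent frequency triples give rank 3, degenerate ones do not.

```python
import numpy as np

def sine_rank(a, b, c, n=50):
    """Rank of the 3 x n matrix whose rows sample sin(ax), sin(bx), sin(cx)."""
    x = np.linspace(0.1, 10.0, n)  # generic sample points
    S = np.vstack([np.sin(a * x), np.sin(b * x), np.sin(c * x)])
    return np.linalg.matrix_rank(S)

print(sine_rank(1, 2, 3))   # distinct nonzero frequencies: rank 3 (independent)
print(sine_rank(1, 2, -2))  # sin(-2x) = -sin(2x): rank drops to 2
print(sine_rank(0, 1, 2))   # sin(0*x) is identically zero: rank 2
```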


An alternative method is via Laplace transforms. Recall that

$$\mathcal{L}(\sigma_r)(s)=\int_0^\infty e^{-sx}\sin(rx)\,dx=\dfrac{r}{s^2+r^2},$$

where $s\in\mathbb{C}$ with positive real part. Then taking the Laplace transform of

$$A\sigma_a+B\sigma_b+C\sigma_c = 0$$

and collecting the rational functions, one obtains

$$(Aa+Bb+Cc)s^4+[Aa(b^2+c^2)+Bb(a^2+c^2)+Cc(a^2+b^2)]s^2+[Aa(bc)^2+Bb(ac)^2+Cc(ab)^2]=0,$$

which again gives three linear equations in the three unknowns $A,B,C$; writing them in matrix form, the coefficient matrix is computed to have the same expression $abc(b^2-a^2)(c^2-a^2)(c^2-b^2)$ as its determinant, up to sign. This method also scales to any number of sine functions, though the determinant computation becomes painful more quickly.
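The denominator-clearing step behind the displayed polynomial can itself be verified symbolically; a sketch with SymPy:

```python
import sympy as sp

s, a, b, c, A, B, C = sp.symbols('s a b c A B C')

# A*a/(s^2+a^2) + B*b/(s^2+b^2) + C*c/(s^2+c^2), multiplied through
# by the common denominator (s^2+a^2)(s^2+b^2)(s^2+c^2)
cleared = (A * a * (s**2 + b**2) * (s**2 + c**2)
           + B * b * (s**2 + a**2) * (s**2 + c**2)
           + C * c * (s**2 + a**2) * (s**2 + b**2))

# The polynomial in s displayed in the text
stated = ((A*a + B*b + C*c) * s**4
          + (A*a*(b**2 + c**2) + B*b*(a**2 + c**2) + C*c*(a**2 + b**2)) * s**2
          + (A*a*(b*c)**2 + B*b*(a*c)**2 + C*c*(a*b)**2))

print(sp.expand(cleared - stated) == 0)  # True
```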