Linearly independent set for sin and cos


Prove that $ \{1,\sin(x),\cos(x),\sin(2x),\cos(2x),\sin(3x),\cos(3x),...\}$ is linearly independent in $C^{\infty}(R)$.

I know how to show the set $ \{ \sin(x),\cos(x),\sin(2x),\cos(2x)\}$ is linearly independent:

Let $a,b,c,d\in\mathbb{R}$ such that $x\mapsto a\sin x + b\cos x+ c\sin 2x + d\cos 2x$ is the $0$ function. For $x=0$, we get $b+d=0$. For $x=\pi$, $-b+d=0$. Combining the two, $b=d=0$. Similarly we can show $a=c=0$.
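As a numerical sanity check of this finite case (an illustration, not part of the proof; the sample points below are an arbitrary choice of mine), one can verify that the matrix of the four functions sampled at four points is invertible, which already forces $a=b=c=d=0$:

```python
import numpy as np

# If a*sin(x) + b*cos(x) + c*sin(2x) + d*cos(2x) is the zero function, it
# vanishes at any four sample points.  If the 4x4 matrix of sampled values
# is invertible, the only possible coefficients are a = b = c = d = 0.
pts = np.array([0.0, np.pi, np.pi / 2, np.pi / 4])  # arbitrary sample points
M = np.column_stack([np.sin(pts), np.cos(pts), np.sin(2 * pts), np.cos(2 * pts)])
det = np.linalg.det(M)
assert abs(det) > 1e-9  # non-zero determinant certifies linear independence
```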

I do not think I can use the same approach here. Also, is $C^{\infty}(\mathbb{R})$ just an infinite-dimensional vector space? Is there a different approach that can show that all coefficients are $0$ for this infinite set?



By definition of linear independence (of an infinite set of vectors), we need to show that for any positive integers $m, n$ and integers $0 < i_1 < \cdots < i_m$, $0 \leq j_1 < \cdots < j_n$, the set $S:= \{\sin(i_1x), \ldots, \sin(i_mx), \cos(j_1x), \ldots, \cos(j_nx)\}$ is linearly independent. To this end, assume \begin{align*} a_1\sin(i_1x) + \cdots + a_m\sin(i_mx) + b_1\cos(j_1x) + \cdots + b_n\cos(j_nx) = 0, \tag{1} \end{align*} where $a_1, \ldots, a_m, b_1, \ldots, b_n \in \mathbb{R}$.

Multiplying both sides of $(1)$ by $\sin(i_kx)$ (for any $1 \leq k \leq m$) and then integrating from $0$ to $2\pi$ yields $\pi a_k = 0$, i.e. $a_k = 0$, where we used the orthogonality of the Fourier functions: \begin{align*} & \int_0^{2\pi} \sin(Mx)\cos(Nx)\,dx = 0; \\ & \int_0^{2\pi} \sin(Mx)\sin(Nx)\,dx = \begin{cases} 0 & M \neq N, \\ \pi & M = N \geq 1; \end{cases} \\ & \int_0^{2\pi} \cos(Mx)\cos(Nx)\,dx = \begin{cases} 0 & M \neq N, \\ \pi & M = N \geq 1, \\ 2\pi & M = N = 0, \end{cases} \end{align*} for non-negative integers $M$ and $N$. Multiplying by $\cos(j_kx)$ instead, you can similarly deduce $b_1 = \cdots = b_n = 0$. Therefore $a_1 = \cdots = a_m = b_1 = \cdots = b_n = 0$; that is, $S$ is linearly independent.
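The orthogonality relations and the coefficient-extraction step can be checked numerically (an illustration only; the trapezoid rule over a full period is extremely accurate for smooth periodic integrands, and the combination `h` below is an arbitrary example of mine):

```python
import numpy as np

# Trapezoid-rule approximation of the inner product <f, g> = ∫_0^{2π} f(x)g(x) dx.
x = np.linspace(0.0, 2.0 * np.pi, 2001)

def inner(f, g):
    y = f(x) * g(x)
    return float(np.sum(0.5 * (y[:-1] + y[1:]) * np.diff(x)))

# <sin(Mx), cos(Nx)> = 0; <sin(Mx), sin(Nx)> = 0 for M != N, π for M = N >= 1.
assert abs(inner(lambda t: np.sin(2 * t), np.cos)) < 1e-8
assert abs(inner(lambda t: np.sin(3 * t), lambda t: np.sin(5 * t))) < 1e-8
assert abs(inner(lambda t: np.sin(3 * t), lambda t: np.sin(3 * t)) - np.pi) < 1e-8

# Extracting a coefficient exactly as in the argument: multiply by sin(kx),
# integrate over a full period, and divide by π.
h = lambda t: 2.0 * np.sin(t) - 0.5 * np.sin(4 * t) + 3.0 * np.cos(2 * t)
a1 = inner(h, np.sin) / np.pi
assert abs(a1 - 2.0) < 1e-8
```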


An approach that does not involve Fourier series or integration:

Suppose that the given set is linearly dependent; that is, for some $n$ there are coefficients $a_k, b_k$, not all zero, with $$\sum_{k=0}^n\big(a_k\cos kx+b_k\sin kx\big)=0$$ for all $x$ (we may take $b_0=0$, since $\sin 0x$ is the zero function).

At $x=0$, $$\sum_{k=0}^na_k=0$$

Differentiate the identity twice to get $\sum_{k=0}^n\big(-a_kk^2\cos kx-b_kk^2\sin kx\big)=0$; evaluating at $x=0$ again gives $$\sum_{k=0}^nk^2a_k=0$$

Repeating with $2m$ derivatives for each $m=0,1,\ldots,n$ gives $$\sum_{k=0}^nk^{2m}a_k=0$$

Hence, writing these $n+1$ equations as a matrix system, $$\begin{pmatrix}1&1&1&\cdots&1\\0&1&2^2&\cdots&n^2\\\vdots&&&&\vdots\\0&1&2^{2n}&\cdots&n^{2n}\end{pmatrix}\begin{pmatrix}a_0\\ a_1\\\vdots\\ a_n\end{pmatrix}=\begin{pmatrix}0\\\vdots\\0\end{pmatrix}$$ the coefficient matrix is the Vandermonde matrix in the distinct nodes $0^2, 1^2, 2^2, \ldots, n^2$, so its determinant is non-zero and $(a_0,\ldots,a_n)=(0,\ldots,0)$.
For the $b_k$, take an odd number of derivatives instead: differentiating $2m+1$ times and evaluating at $x=0$ gives $\sum_{k=1}^nk^{2m+1}b_k=0$ for $m=0,\ldots,n-1$, i.e. a Vandermonde system in the unknowns $kb_k$, and the same argument yields $b_1=\cdots=b_n=0$.
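The Vandermonde step can be verified for a small $n$ (an illustration only; the names and the choice $n=4$ are mine). The determinant equals the product $\prod_{i<j}(x_j-x_i)$ over the nodes $x_k=k^2$, which is non-zero because the nodes are distinct:

```python
import numpy as np

n = 4
# Row m of the system's matrix holds k**(2m) for k = 0..n, i.e. it is the
# (transposed) Vandermonde matrix in the distinct nodes 0^2, 1^2, ..., n^2.
nodes = np.array([k**2 for k in range(n + 1)], dtype=float)
V = np.vander(nodes, increasing=True).T          # V[m, k] = k**(2m)
det = np.linalg.det(V)

# Vandermonde determinant formula: product of pairwise node differences.
expected = np.prod([nodes[j] - nodes[i]
                    for i in range(n + 1) for j in range(i + 1, n + 1)])
assert expected != 0.0                           # distinct nodes => non-zero det
assert abs(det - expected) / abs(expected) < 1e-5
```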


There are other similar approaches. The closest to the asker's method is to write the identity as $\sum_{k=-n}^nc_ke^{ikx}=0$ and evaluate at $x=2m\pi/(2n+1)$ for $m=0,\ldots,2n$, to get $\sum_kc_k=0$, $\sum_kc_k\omega^k=0$, etc., where $\omega=e^{2\pi i/(2n+1)}$; the coefficient matrix is again a Vandermonde matrix (in the distinct nodes $\omega^k$), hence invertible.
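A small numerical sketch of this exponential-basis variant (my own setup: $N=2n+1$ equally spaced sample points, at which the evaluations form a discrete Fourier transform, so `np.fft.fft` inverts the system):

```python
import numpy as np

# Sample f(x) = sum_{k=-n}^{n} c_k e^{ikx} at the N = 2n+1 points
# x_m = 2*pi*m/N.  The samples are an inverse DFT of the coefficients, so
# np.fft.fft recovers every c_k; in particular, if f were identically zero,
# all samples, and hence all c_k, would be zero.
n = 3
N = 2 * n + 1
rng = np.random.default_rng(0)
c = rng.normal(size=N) + 1j * rng.normal(size=N)  # coefficients in FFT order
k = np.concatenate([np.arange(0, n + 1), np.arange(-n, 0)])  # k = 0..n, -n..-1
x = 2 * np.pi * np.arange(N) / N

# samples[m] = sum_j c[j] * exp(i * k[j] * x[m])
samples = (c[None, :] * np.exp(1j * np.outer(x, k))).sum(axis=1)

recovered = np.fft.fft(samples) / N
assert np.allclose(recovered, c)  # the linear system is invertible
```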