We are learning about Fourier transforms in class, and I was wondering about solving the following ODE using this method.
So, I want to solve the equation $u''(x)+u(x)=0$. It is clear that a solution has the form $u(x)=A\cos(x)+B\sin(x)$ for all $x\in\mathbb{R}$, where $A$ and $B$ are constants. Starting from $$u''(x)+u(x)=0$$ and applying the Fourier transform: $$\mathscr F (u'')+\mathscr F(u)=0$$ $$(i \omega )^2\mathscr F (u)+\mathscr F(u)=0$$ $$-\omega^2\mathscr F (u)+\mathscr F(u)=0$$ $$(1-\omega^2)\mathscr F (u)=0$$ $$\Rightarrow \mathscr F (u)=0$$ $$\Rightarrow u=0$$
Now, although $u=0$ is a solution, I don't understand why this method is unable to produce the more general solution $u(x)=A\cos(x)+B\sin(x)$.
From $(1-\omega^2)\mathscr{F}[u](\omega)=0$:
This tells you that $\mathscr{F}[u](\omega)$ has to be zero everywhere except at $\omega = \pm1$. The key point is that $\mathscr{F}[u]$ need not be an ordinary function: as a distribution, it can be supported at exactly those two points. So we assume there exist constants $a,b$ such that $$\mathscr{F}[u](\omega) = a\delta(\omega +1) +b\delta(\omega -1).$$ When you apply the inverse transform, you get what you expect.
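A quick sanity check of this last step, sketched with SymPy (assuming the convention $\mathscr{F}[u](\omega)=\int u(x)e^{-i\omega x}\,dx$ used in the question, so the inverse transform carries a $1/(2\pi)$ factor): inverting the two-delta spectrum by hand via the sifting property and verifying that the result solves the ODE.

```python
import sympy as sp

x = sp.symbols('x', real=True)
w = sp.symbols('omega', real=True)
a, b = sp.symbols('a b')

# Candidate spectrum supported only at omega = -1 and omega = +1
Fu = a * sp.DiracDelta(w + 1) + b * sp.DiracDelta(w - 1)

# Inverse transform: u(x) = (1/2*pi) * integral of Fu(w) * e^{i*w*x} dw;
# the deltas sift out e^{-ix} and e^{ix}
u = sp.integrate(Fu * sp.exp(sp.I * w * x), (w, -sp.oo, sp.oo)) / (2 * sp.pi)

print(sp.simplify(u))                     # a linear combination of e^{-ix} and e^{ix}
print(sp.simplify(sp.diff(u, x, 2) + u))  # 0, i.e. u'' + u = 0
```

Since $e^{\pm ix}=\cos(x)\pm i\sin(x)$, the result is exactly $A\cos(x)+B\sin(x)$ with $A=(a+b)/2\pi$ and $B=i(a-b)/2\pi$.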