In one of the questions of my PDE course we are asked to prove that if $f \in S(\mathbb{R})$, where $S$ denotes the Schwartz space, then there is a $C>0$ such that:
$$\|f\|_{L^\infty} \leq C\left\{\|f\|_{L^2} + \|f'\|_{L^2}\right\}.$$
Using the Fourier inversion formula $f(x) = \frac{1}{2\pi}\int_\mathbb{R}\hat{f}(\xi)e^{ix\xi}\,d\xi$ and the fact that $|1+i\xi| = (1+\xi^2)^{1/2}$, I started by estimating:
$$|f(x)| \leq \frac{1}{2\pi}\int_\mathbb{R}\left|(1+i\xi)\hat{f}(\xi)\right|(1+\xi^2)^{-\frac{1}{2}}\,d\xi.$$
My professor advised using the Cauchy–Schwarz inequality, but I don't see how it helps here; I have written out below how far I can get with it. I also thought about using Parseval's theorem. I am hoping someone might help me by shedding some light on the solution. Thanks in advance.
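Concretely, here is how far I can push the estimate when I try to apply Cauchy–Schwarz to the integral above (I am assuming the convention $\hat{f}(\xi) = \int_\mathbb{R} f(x)e^{-ix\xi}\,dx$, so that the inversion formula carries the factor $\frac{1}{2\pi}$):

$$|f(x)| \leq \frac{1}{2\pi}\left(\int_\mathbb{R}\left|(1+i\xi)\hat{f}(\xi)\right|^2 d\xi\right)^{\frac{1}{2}}\left(\int_\mathbb{R}\frac{d\xi}{1+\xi^2}\right)^{\frac{1}{2}} = \frac{1}{2\sqrt{\pi}}\left\|(1+i\xi)\hat{f}\right\|_{L^2}.$$

The second factor is finite since $\int_\mathbb{R}(1+\xi^2)^{-1}\,d\xi = \pi$, so the bound is uniform in $x$. What I cannot see is how to turn $\|(1+i\xi)\hat{f}\|_{L^2}$ into $\|f\|_{L^2} + \|f'\|_{L^2}$; I suspect this is where Parseval's theorem and the identity $\widehat{f'}(\xi) = i\xi\hat{f}(\xi)$ should come in.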