Let $u \in C^\infty_c(\mathbb R^d)$ and $s> \frac{d}{2}$. Show that
$$ \|u\|_{L^\infty}^2 \leq K \int_{\mathbb{R}^d} |\widehat{u}(\zeta)|^2(1+|\zeta|)^{2s}\, d\zeta,$$ for some constant $K=K(d,s)$.
Thoughts: Not sure where to start. The weight $(1+|\zeta|)^{2s}$ makes me think of the decay rate of the Fourier transform, given that $u$ is smooth and compactly supported. Other than that, I'm not really sure; I don't see how to apply any of the classical inequalities here either.
Hint: Cauchy-Schwarz inequality. You will also need to show that the following integral is finite: \begin{align} \int_{\mathbb{R}^d} \frac{d\zeta}{(1+|\zeta|)^{2s}}. \end{align}
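To see why this integral converges, here is a quick sketch using polar coordinates (with $\omega_{d-1}$ denoting the surface measure of the unit sphere $S^{d-1}$):
\begin{align}
\int_{\mathbb{R}^d} \frac{d\zeta}{(1+|\zeta|)^{2s}} = \omega_{d-1}\int_0^\infty \frac{r^{d-1}}{(1+r)^{2s}}\,dr.
\end{align}
The integrand is bounded near $r=0$ and behaves like $r^{d-1-2s}$ as $r\to\infty$, so the integral converges precisely when $2s>d$, i.e. $s>\frac{d}{2}$. (There is no need to evaluate it in closed form; finiteness is all the argument uses.)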
Edit: Don't forget to use the Fourier inversion formula.
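Putting the two hints together, here is a sketch of the whole argument (assuming the Fourier convention $\widehat{u}(\zeta)=\int_{\mathbb{R}^d} u(x)e^{-2\pi i x\cdot\zeta}\,dx$, so that inversion reads $u(x)=\int_{\mathbb{R}^d}\widehat{u}(\zeta)e^{2\pi i x\cdot\zeta}\,d\zeta$; with other conventions a harmless constant appears). For every $x\in\mathbb{R}^d$,
\begin{align}
|u(x)| &\le \int_{\mathbb{R}^d} |\widehat{u}(\zeta)|\,d\zeta
= \int_{\mathbb{R}^d} |\widehat{u}(\zeta)|(1+|\zeta|)^{s}\cdot(1+|\zeta|)^{-s}\,d\zeta \\
&\le \left(\int_{\mathbb{R}^d} |\widehat{u}(\zeta)|^2(1+|\zeta|)^{2s}\,d\zeta\right)^{1/2}\left(\int_{\mathbb{R}^d} \frac{d\zeta}{(1+|\zeta|)^{2s}}\right)^{1/2},
\end{align}
by Cauchy-Schwarz. Taking the supremum over $x$ and squaring gives the claimed bound with $K=\int_{\mathbb{R}^d}(1+|\zeta|)^{-2s}\,d\zeta$, which is finite exactly because $s>\frac{d}{2}$.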