The exercise is the following:
Let $f \in C^1(\mathbb{R})$ be such that both $f$ and $f'$ belong to $L^2(\mathbb{R})$. Show that $\hat{f} \in L^1(\mathbb{R})$.
What I want to prove is that $\int_\mathbb{R} |\hat{f}(\xi)|\,d\xi < +\infty$, but I do not know how to proceed. I started by writing $\int_\mathbb{R} |\hat{f}(\xi)|\,d\xi = \int_\mathbb{R} \big| \int_\mathbb{R} f(x)e^{-2\pi i \xi x}\, dx\big|\,d\xi$, and then the idea was to use the properties of the Fourier transform and of its derivatives, such as:
- $f'(x) \,\hat{\to}\, 2\pi i \xi\,\hat{f}(\xi)$
- $-2\pi i x f(x) \,\hat{\to}\, \big(\hat{f}\big)'(\xi)$
but these properties require $f$ to be in the Schwartz space, which is not the case here.
Any suggestion is appreciated. Thanks in advance for the help.
From the exchanges among @reuns, @RyszardSzwarc and myself, here is one possible solution:
The assumptions of the problem imply that $\hat{f}\in L^2(\mathbb{R})$ and that $\widehat{f'}(s)=2\pi i s\,\hat{f}(s)$; this identity holds whenever $f\in C^1(\mathbb{R})$ with $f,f'\in L^2(\mathbb{R})$, so it does not require $f$ to be a Schwartz function. By Plancherel's theorem,
$$\int(1+s^2)|\hat{f}(s)|^2\,ds=\int|\hat{f}(s)|^2\,ds+\frac{1}{4\pi^2}\int|\widehat{f'}(s)|^2\,ds=\|f\|_2^2+\frac{1}{4\pi^2}\|f'\|_2^2<\infty.$$
The conclusion follows as indicated by reuns and Ryszard: writing
$$\hat{f}(s)=(1+s^2)^{-1/2}(1+s^2)^{1/2}\hat{f}(s)$$
and so, by the Cauchy–Schwarz inequality,
$$\int|\hat{f}(s)|\,ds\leq\Big(\int(1+s^2)^{-1}\,ds\Big)^{1/2}\Big(\int(1+s^2)|\hat{f}(s)|^2\,ds\Big)^{1/2}<\infty,$$
since $\int(1+s^2)^{-1}\,ds=\pi<\infty$.
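As a quick numerical sanity check (not part of the proof), one can test the final inequality on a concrete example. I use the Gaussian $f(x)=e^{-\pi x^2}$, which under the convention $\hat{f}(\xi)=\int f(x)e^{-2\pi i \xi x}\,dx$ is its own Fourier transform, so both sides can be computed directly on a grid:

```python
# Sanity check of  int |f^| ds  <=  pi^{1/2} * ( int (1+s^2)|f^(s)|^2 ds )^{1/2}
# for the Gaussian f(x) = exp(-pi x^2), whose transform is f^(xi) = exp(-pi xi^2).
# The factor pi^{1/2} is the closed form of ( int (1+s^2)^{-1} ds )^{1/2} over R.
import numpy as np

s = np.linspace(-10.0, 10.0, 200_001)   # the Gaussian tails are negligible beyond |s| = 10
ds = s[1] - s[0]
fhat = np.exp(-np.pi * s**2)            # f^(s) for the Gaussian above

# Riemann sums for both sides of the inequality
lhs = np.sum(np.abs(fhat)) * ds                                   # int |f^(s)| ds  (= 1 analytically)
rhs = np.sqrt(np.pi) * np.sqrt(np.sum((1 + s**2) * np.abs(fhat)**2) * ds)

print(lhs, rhs)   # lhs should be about 1, and strictly below rhs
```

Of course this only illustrates the bound on one (Schwartz) function; the point of the argument above is that it needs nothing beyond $f,f'\in L^2$.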