Let $f \in L^{2}(\mathbb{R})$, $xf \in L^{2}(\mathbb{R})$ and $\int_{\mathbb{R}} f(x)\,dx = 0$. Show that $Hf \in L^{1}(\mathbb{R})$

Question: Let $f \in L^{2}(\mathbb{R})$, $xf \in L^{2}(\mathbb{R})$, and $\int_{\mathbb{R}} f(x)\,dx = 0$. Show that $Hf \in L^{1}(\mathbb{R})$, where $H$ is the Hilbert transform.

The Hilbert transform of a function $f \in \mathscr{S}$ is defined as $$Hf(x)=\frac{1}{\pi} \lim _{\varepsilon \rightarrow 0^+} \int_{|t|>\varepsilon} \frac{f(x-t)}{t}\, d t,$$ where $\mathscr{S}$ is the Schwartz class.
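As a quick numerical illustration of this principal-value definition (not part of the question itself): on the Fourier side $H$ acts as the multiplier $-i\,\operatorname{sgn}(\xi)$, so its discrete/periodic analogue can be sketched with the FFT. The function name `hilbert_transform` below is our own, and the classical identity $H(\cos)=\sin$ serves as a sanity check.

```python
import numpy as np

def hilbert_transform(x):
    """Discrete (periodic) Hilbert transform: multiply each
    Fourier mode by -i * sign(frequency), then invert the FFT."""
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(len(x))       # signed frequencies of each mode
    X *= -1j * np.sign(freqs)            # the multiplier -i * sgn(xi)
    return np.fft.ifft(X).real           # input was real, so keep real part

# Sanity check against the classical identity H(cos) = sin:
t = np.arange(256) * 2 * np.pi / 256     # one full period, 256 samples
h = hilbert_transform(np.cos(3 * t))
print(np.max(np.abs(h - np.sin(3 * t))))  # should be near machine precision
```

Because a pure cosine occupies exactly two Fourier modes, the FFT-based transform reproduces $H(\cos 3t)=\sin 3t$ up to rounding error; for non-periodic $L^2$ functions this is only a heuristic approximation on a finite window.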

I have no idea how to start; thanks for any help!

Answer:

Divide $\mathbb R$ into $\{|x|<\varepsilon\}$ and $\{|x|>\varepsilon\}$ for a fixed $\varepsilon>0$, and use Hölder's inequality on each piece.
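A sketch of how this hint can be carried out, taking the split at $|x|=1$ for concreteness and assuming two standard facts: $H$ is an isometry on $L^2(\mathbb{R})$, and $H(xf)(x) = x\,Hf(x) - \frac{1}{\pi}\int_{\mathbb{R}} f(t)\,dt$.

On the bounded piece, Cauchy–Schwarz gives
$$\int_{|x|\le 1}|Hf(x)|\,dx \;\le\; \sqrt{2}\,\|Hf\|_{L^2} \;=\; \sqrt{2}\,\|f\|_{L^2} \;<\; \infty.$$
On the unbounded piece, the hypothesis $\int_{\mathbb{R}} f(t)\,dt = 0$ turns the identity into $x\,Hf(x) = H(xf)(x)$, which lies in $L^2$ because $xf \in L^2$. Hence, by Cauchy–Schwarz again,
$$\int_{|x|> 1}|Hf(x)|\,dx \;=\; \int_{|x|>1}\frac{|x\,Hf(x)|}{|x|}\,dx \;\le\; \|H(xf)\|_{L^2}\left(\int_{|x|>1}\frac{dx}{x^{2}}\right)^{1/2} \;=\; \sqrt{2}\,\|xf\|_{L^2}.$$
Adding the two estimates shows $Hf \in L^{1}(\mathbb{R})$.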