Fourier analysis with Lipschitz condition


Suppose $u$ is a distribution on $\mathbb{R}$ with compact support $K$, and suppose its Fourier transform $\hat{u}$ is a bounded function on $\mathbb{R}$. Let $f\in L^1(\mathbb{R})$ with $\hat{f}=0$ on $K$, and suppose $\hat{f}$ satisfies a Lipschitz (Hölder) condition of order $\frac12$, i.e. $|\hat{f}(t)-\hat{f}(s)|\leq C|t-s|^{1/2}$. Prove that then $$\int_{-\infty}^\infty f(x)\hat{u}(x)\,dx=0.$$

We know that $C_c^\infty(\mathbb{R})$ is dense in $L^1(\mathbb{R})$, so we can find a sequence $\{f_n\}\subset C_c^\infty(\mathbb{R})$ with $f_n\to f$ in $L^1$ as $n\to\infty$. Pairing each $f_n$ with the distribution $u$ (writing the pairing formally as an integral), we have $$u(f_n)=\int_{-\infty}^\infty u(x)f_n(x)\,dx,$$ and hence, by the definition of the Fourier transform of a distribution, $$\hat{u}(f_n)=u(\widehat{f_n})=\int_{-\infty}^\infty u(x)\widehat{f_n}(x)\,dx.$$ After this I have no idea how to proceed; any help is appreciated.
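For what it's worth, the approximation above can be pushed one stage further (a sketch only, under the stated hypotheses). Since $\hat{u}\in L^\infty(\mathbb{R})$ and $f_n\to f$ in $L^1$,

$$\left|\int_{-\infty}^\infty f(x)\hat{u}(x)\,dx-\int_{-\infty}^\infty f_n(x)\hat{u}(x)\,dx\right|\leq\|\hat{u}\|_\infty\,\|f-f_n\|_{L^1}\to 0,$$

so

$$\int_{-\infty}^\infty f(x)\hat{u}(x)\,dx=\lim_{n\to\infty}\hat{u}(f_n)=\lim_{n\to\infty}u(\widehat{f_n}).$$

Moreover $\|\widehat{f_n}-\hat{f}\|_\infty\leq\|f_n-f\|_{L^1}\to 0$, so $\widehat{f_n}\to\hat{f}$ uniformly. The remaining (and genuinely hard) step would be to show that $u(\widehat{f_n})\to 0$; this is where $\hat{f}=0$ on $K=\operatorname{supp}u$ and the Hölder-$\frac12$ condition must enter, since uniform convergence alone does not control the action of a distribution, which may involve derivatives.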