Use the uncertainty principle to prove that $\int_{\mathbb R^n} (f(x))^2|\hat{f}(x)|^2 \,dx=0$ if and only if $f\equiv 0$.


Let $f \in L^2(\mathbb R^n)$. Use the uncertainty principle to prove that $\int_{\mathbb R^n} (f(x))^2|\hat{f}(x)|^2 \,dx=0$ if and only if $f\equiv 0$.

$\leftarrow$: If $f \equiv 0$, then $\int_{\mathbb R^n} (f(x))^2|\hat{f}(x)|^2 \,dx=\int_{\mathbb R^n} (0)^2|\hat{f}(x)|^2 \,dx=\int_{\mathbb R^n} 0 \,dx=0$.

Done

$\rightarrow$: If $\int_{\mathbb R^n} (f(x))^2|\hat{f}(x)|^2 \,dx=0$, then $\ldots$ The uncertainty principle states $$\int_{\mathbb R^n} |x|^2|f(x)|^2 \,dx\int_{\mathbb R^n} |y|^2|\hat{f}(y)|^2 \,dy \geq (2 \pi)^{-2n}||f||_{L^2(\mathbb R^n)}^4$$
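One observation I can make (assuming $f$ is real-valued, so that $(f(x))^2 = |f(x)|^2 \geq 0$): the integrand is nonnegative, so the hypothesis forces it to vanish almost everywhere.

```latex
% Since (f(x))^2 |\hat{f}(x)|^2 >= 0, a zero integral forces the
% integrand to vanish almost everywhere:
\int_{\mathbb R^n} |f(x)|^2 |\hat{f}(x)|^2 \,dx = 0
\quad \Longrightarrow \quad
|f(x)|^2 |\hat{f}(x)|^2 = 0 \quad \text{for a.e. } x \in \mathbb R^n,
% i.e. the sets \{f \neq 0\} and \{\hat{f} \neq 0\} are disjoint
% up to a set of measure zero.
```

But I do not see how to get from this disjoint-support statement to the lower bound in the uncertainty principle.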

How am I supposed to use this fact to prove this direction?