Proving the quantitative uncertainty principle


I am being asked to prove the following quantitative uncertainty principle:

Let $f\in\mathscr{S}(\mathbb{R}^n)$ be Schwartz class, then $$ \lVert f\rVert_{L^2(\mathbb{R}^n)}^2\le\frac{4\pi}{n}\inf_{y\in\mathbb{R}^n}\left(\int_{\mathbb{R}^n}\lvert x-y\rvert^2\lvert f(x)\rvert^2\,\mathrm{d}x\right)^{\frac12}\inf_{z\in\mathbb{R}^n}\left(\int_{\mathbb{R}^n}\lvert\xi-z\rvert^2\lvert\widehat{f}(\xi)\rvert^2\,\mathrm{d}\xi\right)^{\frac12}$$

Now, in the hint, I am told to use the fact that for fixed $y\in\mathbb{R}^n$, $$\lVert f\rVert_{L^2}^2=\frac{1}{n}\int_{\mathbb{R}^n}f(x)\overline{f(x)}\sum_{j=1}^n\partial_j(x_j-y_j)\,\mathrm{d}x,$$ integrate by parts, apply Cauchy–Schwarz and Parseval, and use the fact that $$\sum_{j=1}^n\lvert\widehat{\partial_j f}(\xi)\rvert^2 = 4\pi^2\lvert\xi\rvert^2\lvert\widehat{f}(\xi)\rvert^2$$ for all $\xi\in\mathbb{R}^n$. (The shift $z$ then enters by applying the argument to the modulation $e^{-2\pi i z\cdot x}f(x)$, whose Fourier transform is $\widehat{f}(\xi+z)$.)

So far, I've been able to show that \begin{align} \lVert f\rVert_{L^2}^2 &= \frac{1}{n}\int_{\mathbb{R}^n}f(x)\overline{f(x)}\sum_{j=1}^n\partial_j(x_j-y_j)\,\mathrm{d}x \\ &= \frac{1}{n}\int_{\mathbb{R}^n}\sum_{j=1}^n(y_j-x_j)\left(\partial_j f(x)\overline{f(x)} + f(x)\overline{\partial_j f(x)}\right)\,\mathrm{d}x \\ &= \frac{2}{n}\Re\left(\int_{\mathbb{R}^n}\sum_{j=1}^n(y_j-x_j)\,\partial_j f(x)\overline{f(x)}\,\mathrm{d}x \right) \\ &\le \frac{2}{n}\int_{\mathbb{R}^n}\lvert f(x)\rvert\,\lvert\nabla f(x)\cdot(y-x)\rvert\,\mathrm{d}x \\ &\le \frac{2}{n}\int_{\mathbb{R}^n}\lvert f(x)\rvert\,\lvert\nabla f(x)\rvert\,\lvert y-x\rvert\,\mathrm{d}x \\ &\le \frac{2}{n}\left(\int_{\mathbb{R}^n}\lvert f(x)\rvert^2\lvert x-y\rvert^2\,\mathrm{d}x \right)^{\frac12} \left(\int_{\mathbb{R}^n}\lvert \nabla f(x)\rvert^2\,\mathrm{d}x \right)^{\frac12} \end{align}
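As a sanity check, the last inequality above can be verified numerically in one dimension. The sketch below (my own check, not part of the hint) takes $n=1$, $y=0$, and the Gaussian $f(x)=e^{-\pi x^2}$, which is the extremal case, so both sides should agree up to quadrature error:

```python
import numpy as np

# Check ||f||² ≤ 2 (∫ |f|²|x|² dx)^{1/2} (∫ |f'|² dx)^{1/2} for n = 1, y = 0,
# with the Gaussian f(x) = exp(-πx²) (the extremizer, so the sides coincide).
x = np.linspace(-8.0, 8.0, 20001)
dx = x[1] - x[0]
f = np.exp(-np.pi * x**2)
df = -2.0 * np.pi * x * f          # exact derivative of the Gaussian

lhs = np.sum(np.abs(f)**2) * dx                 # ||f||²  (analytically 1/√2)
disp = np.sum(x**2 * np.abs(f)**2) * dx         # ∫ |x|² |f(x)|² dx
grad = np.sum(np.abs(df)**2) * dx               # ∫ |f'(x)|² dx
rhs = 2.0 * np.sqrt(disp) * np.sqrt(grad)

print(lhs, rhs)  # both ≈ 0.70711
```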

And so I'm pretty sure that I have to do something with $$\int_{\mathbb{R}^n}\lvert \nabla f\rvert^2\,\mathrm{d}x = \int_{\mathbb{R}^n}\widehat{\lvert \nabla f\rvert}^2(\xi)\,\mathrm{d}\xi$$

but I don't see how to get $$\sum_{j=1}^n\lvert\widehat{\partial_j f}(\xi)\rvert^2$$ from $\widehat{\lvert \nabla f\rvert}^2(\xi)$. Am I missing something obvious?

Best answer:

You're on the right track. By Plancherel applied to each component of the gradient, $$\int_{\mathbb{R}^n}\lvert \nabla f\rvert^2\,\mathrm{d}x = \sum_{j=1}^n\int_{\mathbb{R}^n}\lvert \partial_j f\rvert^2\,\mathrm{d}x = \sum_{j=1}^n\int_{\mathbb{R}^n}\lvert\widehat{\partial_j f}(\xi)\rvert^2\,\mathrm{d}\xi,$$ so it is the transform of each component $\partial_j f$ that appears, not the transform of the scalar $\lvert\nabla f\rvert$. Further, integration by parts gives $\widehat{\partial_j f}(\xi) = 2\pi i \xi_j \widehat{f}(\xi)$, hence $$\sum_{j=1}^n\lvert\widehat{\partial_j f}(\xi)\rvert^2 = 4\pi^2\lvert\xi\rvert^2\lvert\widehat{f}(\xi)\rvert^2.$$ Substituting into your last inequality yields \begin{align} \lVert f\rVert_{L^2}^2 &\leq \frac{2}{n}\left(\int_{\mathbb{R}^n}\lvert f(x)\rvert^2\lvert x-y\rvert^2\,\mathrm{d}x \right)^{\frac12} \left(\int_{\mathbb{R}^n}4\pi^2\lvert\xi\rvert^2\lvert \widehat{f}(\xi)\rvert^2\,\mathrm{d}\xi \right)^{\frac12}\\ & = \frac{4\pi}{n}\left(\int_{\mathbb{R}^n}\lvert f(x)\rvert^2\lvert x-y\rvert^2\,\mathrm{d}x \right)^{\frac12}\left(\int_{\mathbb{R}^n}\lvert\xi\rvert^2\lvert \widehat{f}(\xi)\rvert^2\,\mathrm{d}\xi \right)^{\frac12}. \end{align} To bring in the shift $z$, apply this bound to the modulated function $g(x) = e^{-2\pi i z\cdot x}f(x)$: since $\lvert g\rvert = \lvert f\rvert$, both $\lVert g\rVert_{L^2} = \lVert f\rVert_{L^2}$ and the $x$-integral are unchanged, while $\widehat{g}(\xi) = \widehat{f}(\xi+z)$, so the change of variables $\eta = \xi + z$ gives $$\int_{\mathbb{R}^n}\lvert\xi\rvert^2\lvert\widehat{g}(\xi)\rvert^2\,\mathrm{d}\xi = \int_{\mathbb{R}^n}\lvert\xi\rvert^2\lvert\widehat{f}(\xi+z)\rvert^2\,\mathrm{d}\xi = \int_{\mathbb{R}^n}\lvert\eta-z\rvert^2\lvert\widehat{f}(\eta)\rvert^2\,\mathrm{d}\eta.$$ Since $y$ and $z$ were arbitrary, taking the infimum over both gives the stated inequality.
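The full bound can also be checked numerically. The sketch below (a check of mine, using the convention $\widehat{f}(\xi)=\int f(x)e^{-2\pi i x\xi}\,\mathrm{d}x$ from the derivation above) takes $n=1$, $y=z=0$, and the Gaussian $f(x)=e^{-\pi x^2}$, which is its own Fourier transform and attains equality:

```python
import numpy as np

# Numerical check of ||f||² ≤ (4π/n)(∫|x|²|f|²)^{1/2}(∫|ξ|²|fhat|²)^{1/2}
# in 1D (n = 1, y = z = 0) for the Gaussian f(x) = exp(-πx²).
x = np.linspace(-6.0, 6.0, 1201)
dx = x[1] - x[0]
f = np.exp(-np.pi * x**2)

xi = np.linspace(-4.0, 4.0, 801)
dxi = xi[1] - xi[0]
# Direct quadrature of the Fourier integral fhat(ξ) = ∫ f(x) e^{-2πi xξ} dx.
fhat = (f[None, :] * np.exp(-2j * np.pi * np.outer(xi, x))).sum(axis=1) * dx

norm2 = np.sum(np.abs(f)**2) * dx                 # ||f||²  (analytically 1/√2)
disp_x = np.sum(x**2 * np.abs(f)**2) * dx         # position dispersion
disp_xi = np.sum(xi**2 * np.abs(fhat)**2) * dxi   # frequency dispersion

lhs = norm2
rhs = 4.0 * np.pi * np.sqrt(disp_x * disp_xi)     # 4π/n with n = 1

print(lhs, rhs)  # both ≈ 0.70711
```

Trying other test functions (e.g. a bump with a phase modulation) shows the inequality strict, with equality only for Gaussians, as expected from the Cauchy–Schwarz step.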