$\newcommand{\bR}{\mathbb{R}}$I'm trying to understand the proof of the Heisenberg inequality, by means of the following inequality
$$\left[\int|f(x)|^2\,\mathrm dx\right]^2\leq4\int|xf(x)|^2\,\mathrm dx\int|f'(x)|^2\,\mathrm dx,$$
where $f,xf,f'\in L^2(\bR)$ and $f'$ is the $L^2$ derivative of $f$ (i.e., $y^{-1}(\tau_{-y}f-f)\to f'$ in the $L^2$ norm as $y\to 0$, where $\tau_{-y}f(x)=f(x+y)$).
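As a sanity check (not part of the proof), the inequality can be verified numerically. Here is a small script, assuming SciPy is available, testing it for the Gaussian $f(x)=e^{-x^2/2}$, which is in fact an equality case:

```python
import numpy as np
from scipy.integrate import quad

# Test function: the Gaussian f(x) = exp(-x^2/2), an equality case.
f = lambda x: np.exp(-x**2 / 2)
fp = lambda x: -x * np.exp(-x**2 / 2)  # f'(x), computed by hand

norm_sq, _ = quad(lambda x: f(x)**2, -np.inf, np.inf)          # integral of |f|^2
x_norm_sq, _ = quad(lambda x: (x * f(x))**2, -np.inf, np.inf)  # integral of |xf|^2
d_norm_sq, _ = quad(lambda x: fp(x)**2, -np.inf, np.inf)       # integral of |f'|^2

lhs = norm_sq**2
rhs = 4 * x_norm_sq * d_norm_sq
print(lhs, rhs)  # both equal pi for the Gaussian
```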
The proof method I believe works goes something like this: first, one shows that since $f,f'\in L^2$, $f$ is differentiable almost everywhere, on every bounded interval $f$ agrees a.e. with an absolutely continuous function, and $f'$ agrees with the pointwise derivative of $f$ almost everywhere. I can do this. Thus by the product rule, for a.e. $x$ we have $$(|f|^2)'(x)=2\operatorname{Re}\left(f(x)\overline{f'(x)}\right)$$ (it is straightforward to see that the $L^2$ derivative of $\overline f$ is $\overline{f'}$). Then we may again apply the product rule to get that for a.e. $x\in\bR$, the function $x\mapsto xf(x)\overline{f(x)}=x|f(x)|^2$ is differentiable and satisfies $$\frac d{dx}\left[x|f(x)|^2\right]=|f(x)|^2+2x\operatorname{Re}\left(f(x)\overline{f'(x)}\right).$$ Integrating both sides, we get $$\int\frac d{dx}\left[x|f(x)|^2\right]\,\mathrm dx=\int|f(x)|^2\,\mathrm dx+2\int x\operatorname{Re}\left(f(x)\overline{f'(x)}\right)\,\mathrm dx.$$ Then the fundamental theorem of calculus for the Lebesgue integral is applied to the left integral, and it is asserted that $$\int\frac d{dx}\left[x|f(x)|^2\right]\,\mathrm dx=\lim_{a\to\infty}\int_{-a}^a\frac d{dx}\left[x|f(x)|^2\right]\,\mathrm dx=\lim_{a\to\infty}\left[x|f(x)|^2\right]_{-a}^a.$$ This is the step I cannot figure out. My version of the fundamental theorem of calculus for the Lebesgue integral only allows this provided that $x|f(x)|^2$ is absolutely continuous on the interval $[-a,a]$ for each $a>0$ (Folland, Theorem 3.35). I know the following facts:
- The product of absolutely continuous functions is absolutely continuous
- Given any bounded interval $[a,b]\subseteq\bR$, for a.e. $x\in[a,b]$, $f$ agrees with the absolutely continuous function $$x\mapsto f(a)+\int_a^xf'(y)\,\mathrm dy$$ (in particular, $f(x)=f(a)+\int_a^xf'(y)\,\mathrm dy$ for at least those $x$ in the Lebesgue set of $f$).
How can I use these facts to conclude that $$\lim_{a\to\infty}\int_{-a}^a\frac d{dx}\left[x|f(x)|^2\right]\,\mathrm dx=\lim_{a\to\infty}\left[a|f(a)|^2+a|f(-a)|^2\right]?$$
Let $g(x)=x|f(x)|^2$. Then $g'=|f|^2+2x\operatorname{Re}(f\overline{f'})$ a.e. We have $|f|^2\in L^1$, and $2x\operatorname{Re}(f\overline{f'})\in L^1$ by Cauchy-Schwarz (this is where the RHS of the Heisenberg inequality comes from... if it is infinite, the inequality is obvious, so suppose it is finite). Thus $g'$, being the sum of two $L^1$ functions, is $L^1$, so by dominated convergence ($|g'|$ dominates $|g'|\cdot 1_{[a,b]}$ for all $a,b\in\bR$) we have \begin{align} \int_{\bR}g'(x)\,dx&=\lim\limits_{\substack{b\to\infty\\a\to-\infty}}\int_a^bg'(x)\,dx. \end{align} Now, $g$ is absolutely continuous on every compact subinterval as you said, so by the Lebesgue FTC (your second bullet point), we have that for a.e. $a,b$ (e.g. those in the Lebesgue set), $\int_a^bg'(x)\,dx=g(b)-g(a)$.
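To spell out the Cauchy-Schwarz step in the same notation (my addition, for completeness): $$\left|\int 2x\operatorname{Re}\left(f(x)\overline{f'(x)}\right)\,\mathrm dx\right|\leq 2\int|xf(x)||f'(x)|\,\mathrm dx\leq 2\left[\int|xf(x)|^2\,\mathrm dx\right]^{1/2}\left[\int|f'(x)|^2\,\mathrm dx\right]^{1/2},$$ which is finite since $xf,f'\in L^2$; squaring the identity $\int|f|^2\,\mathrm dx=-2\int x\operatorname{Re}(f\overline{f'})\,\mathrm dx$ (valid once the boundary terms are shown to vanish) and applying this bound gives exactly the stated inequality.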
This answers the question you asked. But the real ‘challenge’ is proving that this limit equals zero, i.e., that the given assumptions force $f$ to vanish rapidly enough that the boundary terms contribute nothing. I’ll let you think about this for a while; I may edit afterwards.
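Not a proof, but as a concrete illustration of what needs to be shown: for the Gaussian $f(x)=e^{-x^2/2}$ one has $g(a)-g(-a)=2ae^{-a^2}\to 0$, and the identity $\int_{-a}^a g'\,dx=g(a)-g(-a)$ can be checked numerically (a sketch, assuming SciPy is available):

```python
import numpy as np
from scipy.integrate import quad

# For f(x) = exp(-x^2/2): g(x) = x|f(x)|^2 = x*exp(-x^2)
g = lambda x: x * np.exp(-x**2)
# g'(x) = |f|^2 + 2x*Re(f*conj(f')) = (1 - 2x^2)*exp(-x^2), by hand
gp = lambda x: (1 - 2 * x**2) * np.exp(-x**2)

for a in (1.0, 2.0, 5.0):
    integral, _ = quad(gp, -a, a)
    boundary = g(a) - g(-a)  # = 2a*exp(-a^2), the boundary term
    print(a, integral, boundary)  # the two columns agree; both -> 0
```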