Heisenberg inequality proof


Suppose $f \in L^2(\mathbb{R})$. Pick $f_n \in C^{\infty}_{\downarrow}(\mathbb{R})$ such that $$\|f_n-f \|_2^2 + \|f_n'-f'\|_2^2 = \int(1+4\pi^2\gamma^2)\, | \hat f_n-\hat f|^2 \, d\gamma \rightarrow 0 \quad \text{as}~n\rightarrow\infty,$$ where $f' := [2\pi i \gamma \hat f]^{\check{}}.$
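(For reference, the displayed identity is just Plancherel's theorem applied to $f_n - f$ and its derivative; a sketch of the computation, using $(h')^{\wedge}(\gamma) = 2\pi i \gamma\, \hat h(\gamma)$:

```latex
\begin{align*}
\|f_n - f\|_2^2 + \|f_n' - f'\|_2^2
  &= \|\hat f_n - \hat f\|_2^2 + \|2\pi i \gamma\,(\hat f_n - \hat f)\|_2^2 \\
  &= \int \bigl(1 + 4\pi^2\gamma^2\bigr)\,|\hat f_n(\gamma) - \hat f(\gamma)|^2 \, d\gamma . )
\end{align*}
```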

(I've proved that we can always choose such functions $f_n$. Here, $C^{\infty}_{\downarrow}(\mathbb{R})$ denotes the class of infinitely differentiable functions $f$ such that $|x|^p\,|D^q f(x)| \rightarrow 0$ as $|x| \rightarrow \infty$ for all nonnegative integers $p$ and $q$.)

I also know that $f_n \rightarrow f$ uniformly.
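(The uniform convergence can be seen, assuming the Fourier inversion formula applies to $f_n - f$, by splitting the weight and applying Cauchy–Schwarz:

```latex
\|f_n - f\|_\infty
  \le \|\hat f_n - \hat f\|_1
  = \int (1+4\pi^2\gamma^2)^{-1/2}\,(1+4\pi^2\gamma^2)^{1/2}\,|\hat f_n - \hat f| \, d\gamma
  \le C \left( \int (1+4\pi^2\gamma^2)\,|\hat f_n - \hat f|^2 \, d\gamma \right)^{1/2},
```

where $C = \bigl(\int (1+4\pi^2\gamma^2)^{-1}\, d\gamma\bigr)^{1/2} = \tfrac{1}{\sqrt{2}} < \infty$, so the right side tends to $0$ by the hypothesis above.)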

What I want to verify is that

$$\int_{0}^{x} f'(y) \, dy = \lim_{n\rightarrow\infty} \int_{0}^{x} f_n'(y) \, dy,$$

in order to check that $f'$ *really is* the derivative of $f$.

The book (Dym, Fourier Analysis) says that this can be easily checked, but I have no idea how to prove it.

Any help will be appreciated!

Best answer:

Suppose $\{ f_n \} \subset L^2(\mathbb{R})$ is a sequence of absolutely continuous functions with $\{ f_n' \} \subset L^2(\mathbb{R})$. If $\{ f_n \}$ converges in $L^2$ to $f$, and if $\{ f_n' \}$ converges in $L^2$ to $g$, then $f$ is (a.e. equal to) an absolutely continuous function with $f' = g$.

To see that this is true, note that absolute continuity gives $$ f_n(x)-f_n(a) =\int_{a}^{x}f_n'(t)\,dt. $$ The right side converges pointwise everywhere to $\int_{a}^{x}g(t)\,dt$: since $\chi_{[a,x]}\in L^2$, the Cauchy–Schwarz inequality shows that $L^2$ convergence of $\{ f_n' \}$ to $g$ forces $f_n'\chi_{[a,x]}$ to converge in $L^1$ to $g\chi_{[a,x]}$. Meanwhile, a subsequence $\{ f_{n_k} \}$ converges pointwise a.e. to $f$, while the left side above converges pointwise everywhere. Thus $f$ is equal a.e. to a continuous function on $\mathbb{R}$ satisfying $$ f(x)-f(a) = \int_{a}^{x}g(t)\,dt. $$ So $f$ is equal a.e. to an absolutely continuous function for which $f'=g\in L^2$.
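(The $L^1$ convergence step in this answer is the Cauchy–Schwarz inequality applied on the finite-measure set $[a,x]$:

```latex
\int_{a}^{x} |f_n'(t) - g(t)| \, dt
  = \int_{\mathbb{R}} |f_n' - g|\,\chi_{[a,x]} \, dt
  \le \|f_n' - g\|_2 \, \|\chi_{[a,x]}\|_2
  = |x-a|^{1/2} \, \|f_n' - g\|_2 \longrightarrow 0 . )
```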