Pointwise convergence implies $L^p$


Simply, why does pointwise convergence $u_j \rightarrow u$ imply convergence in $L^p$, provided $|u_j(x)| \le g(x)$ for some $g \in L_+^p$?


Simply, Lebesgue's dominated convergence theorem. The domination hypothesis cannot be dropped: the traveling square wave $f_n=\chi_{[n,n+1]}$ converges pointwise to zero but has $L^p$ norm $1$ for every $p$.
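The counterexample can be checked numerically. Below is a quick sketch (not part of the original answer) that approximates the $L^p$ norm of $f_n=\chi_{[n,n+1]}$ on a grid: the norm stays at $1$ as the wave travels off to infinity, while at any fixed point the sequence is eventually zero.

```python
import numpy as np

p = 2
xs = np.linspace(0, 100, 100_001)  # grid on [0, 100], step 0.001
dx = xs[1] - xs[0]

def f(n, x):
    # traveling square wave: indicator of [n, n+1)
    return ((x >= n) & (x < n + 1)).astype(float)

# L^p norm is 1 for every n ...
norms = [(np.sum(f(n, xs) ** p) * dx) ** (1 / p) for n in range(5)]

# ... but at any fixed point x, f_n(x) is eventually 0,
# so the pointwise limit is the zero function
pointwise = [f(n, np.array([3.5]))[0] for n in range(10)]
```

So `norms` is a list of values all approximately `1.0`, while `pointwise` is `1.0` only at `n = 3` (when the wave covers $x=3.5$) and `0.0` otherwise: pointwise convergence to zero with no $L^p$ convergence.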


Since $u_j \to u$ pointwise and $|u_j(x)| \leqslant g(x)$, we have $$ |u(x)|=\lim_{j\to\infty}|u_j(x)|\leqslant g(x). $$ Hence $|u_j(x)-u(x)|^p\to 0$ a.e., and $$ |u_j(x)-u(x)|^p \leqslant \bigl(|u_j(x)|+|u(x)|\bigr)^p \leqslant \bigl(2g(x)\bigr)^p = 2^p\,g(x)^p, $$ where $2^p g^p$ is integrable. By Lebesgue's dominated convergence theorem, $$ \lim_{j\to\infty}\int_A |u_j(x)-u(x)|^p\:d\mu=0. $$
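The argument can also be illustrated numerically. Here is a small sketch (an assumption-laden example, not from the original answer) using $u_j(x)=x^j$ on $[0,1]$, which converges pointwise to $0$ on $[0,1)$ and is dominated by the constant function $g \equiv 1 \in L^p([0,1])$; the $L^p$ distance to the limit indeed tends to zero.

```python
import numpy as np

p = 2
xs = np.linspace(0, 1, 100_001)  # grid on [0, 1]
dx = xs[1] - xs[0]

def lp_norm(v):
    # Riemann-sum approximation of the L^p norm on [0, 1]
    return (np.sum(np.abs(v) ** p) * dx) ** (1 / p)

# u_j(x) = x^j -> 0 pointwise on [0, 1), dominated by g = 1.
# Exactly: ||u_j - 0||_p = (1 / (j*p + 1))^(1/p) -> 0.
errs = [lp_norm(xs ** j) for j in (1, 10, 100, 1000)]
```

Each `errs` entry approximates $(jp+1)^{-1/p}$, and the list decreases toward $0$, in line with the dominated convergence argument above.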