From convergence in $L^2$ to convergence of the squares in $L^1$


Given a function $f\in L^2(X,\mu)$ for some measure space $(X,\mu)$, we of course have that $|f|^2$ is a nonnegative function in $L^1(X,\mu)$. I now came across the following claim. Suppose $\int_X |f_n(x)-g_n(x)|^2 \,d\mu(x)\rightarrow 0$ for sequences $f_n,g_n\in L^2$. Then we also have $\int_X \big| \,|f_n(x)|^2-|g_n(x)|^2\big| \,d\mu(x)\rightarrow 0$. Can anybody give an idea of how to prove this?

Edit: We also assume the sequences are bounded in $L^2$. This is genuinely needed: on $[0,1]$ with Lebesgue measure, take the constant functions $f_n = n$ and $g_n = n + 1/n$. Then $\|f_n - g_n\|_2 = 1/n \rightarrow 0$, yet $\int \big|\,|f_n|^2 - |g_n|^2\big| = (n+1/n)^2 - n^2 = 2 + 1/n^2 \rightarrow 2 \neq 0$.
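The necessity of the boundedness assumption can be checked numerically. The sketch below (my own illustrative choice of sequences, not from the question) uses the constant functions $f_n = n$ and $g_n = n + 1/n$ on $[0,1]$, where all integrals reduce to pointwise values:

```python
# Counterexample sketch: on X = [0,1] with Lebesgue measure, take the
# constant functions f_n = n and g_n = n + 1/n. All integrals over [0,1]
# of a constant equal that constant.
for n in [1, 10, 100, 1000]:
    l2_dist_sq = (n - (n + 1 / n)) ** 2        # ∫ |f_n - g_n|^2 = 1/n^2 -> 0
    l1_gap = abs((n + 1 / n) ** 2 - n ** 2)    # ∫ | |f_n|^2 - |g_n|^2 | = 2 + 1/n^2 -> 2
    print(n, l2_dist_sq, l1_gap)
```

The squared $L^2$ distance tends to $0$ while the $L^1$ gap of the squares tends to $2$, so the conclusion fails when $\|f_n\|_2$ is allowed to grow.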

Best answer:

$$\begin{aligned}
\int \left| |f_n|^2 - |g_n|^2 \right| &= \int \big| |f_n| - |g_n| \big| \,\big( |f_n| + |g_n| \big) \\
&\le \int |f_n - g_n| \,\big( |f_n| + |g_n| \big) \\
&\le \sqrt{\int |f_n - g_n|^2} \,\sqrt{\int \big( |f_n| + |g_n| \big)^2} \\
&\le \sqrt{\int |f_n - g_n|^2} \left( \sqrt{\int |f_n|^2} + \sqrt{\int |g_n|^2} \right)
\end{aligned}$$

where I used the reverse triangle inequality $\big||f_n| - |g_n|\big| \le |f_n - g_n|$, the Cauchy–Schwarz inequality, and Minkowski's inequality, in that order. Since $\|f_n\|_2$ and $\|g_n\|_2$ are bounded by assumption, the right-hand side tends to $0$.
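The chain of inequalities can be sanity-checked numerically. The sketch below (my own illustration, not part of the answer) models $X$ as a finite set with counting measure, so every integral becomes a sum over a NumPy array:

```python
import numpy as np

rng = np.random.default_rng(0)
# Model X as a finite set with counting measure: integrals become sums.
f = rng.normal(size=1000)
g = rng.normal(size=1000)

lhs = np.sum(np.abs(np.abs(f) ** 2 - np.abs(g) ** 2))
# Reverse triangle inequality, applied pointwise inside the integral:
step1 = np.sum(np.abs(f - g) * (np.abs(f) + np.abs(g)))
# Cauchy-Schwarz:
step2 = np.sqrt(np.sum((f - g) ** 2)) * np.sqrt(np.sum((np.abs(f) + np.abs(g)) ** 2))
# Minkowski:
step3 = np.sqrt(np.sum((f - g) ** 2)) * (np.sqrt(np.sum(f ** 2)) + np.sqrt(np.sum(g ** 2)))

assert lhs <= step1 <= step2 <= step3
print(lhs, step1, step2, step3)
```

Each quantity bounds the previous one, matching the four steps of the displayed inequality chain.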