Show that if $g_n$ converges to $g$ in $L^2$-norm and $\int g_n \,d\mu=0$ for all $n$, then $\int g \, d\mu=0$


Let $(\Omega,\mathcal{A},\mu)$ be a finite measure space, and let $\{g_n\}$ be a sequence in $L^2(\Omega,\mathcal{A},\mu)$ converging to $g$ in the $L^2$-norm. Show that if $\int g_n\,d\mu=0$ for all $n$, then $\int g \, d\mu=0$.

First of all, I don't really know if this is true, but I have reduced another exercise to this question and now I don't see how to proceed any further. Any hints (for proving or disproving) would be appreciated.


2 Answers

Accepted answer

True for the edited version (where the measure space is assumed finite).

One can assume that $\mu$ is a probability measure: dividing by $\mu(\Omega)$ gives a new measure under which integrals and norms are only rescaled by constant factors, so neither the hypothesis $\int g_n\,d\mu=0$ nor the $L^2$-convergence is affected. So I shall assume $\mu(\Omega)=1$.
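Concretely (a small computation not spelled out above, where $\nu := \mu/\mu(\Omega)$ is my own notation for the normalized measure):
$$\int f\,d\nu = \frac{1}{\mu(\Omega)}\int f\,d\mu, \qquad \|f\|_{L^2(\nu)} = \frac{1}{\sqrt{\mu(\Omega)}}\,\|f\|_{L^2(\mu)},$$
so $\int g_n\,d\nu = 0$ for all $n$ and $g_n \to g$ in $L^2(\nu)$ as well.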

In this case, a direct application of Hölder's inequality gives at once $$\|f\|_1 \leq \|f\|_2 \|1\|_2 = \|f\|_2$$ (using $\|1\|_2 = 1$ since $\mu$ is a probability measure); applying this to $f = g - g_n$, convergence in $\mathscr{L}^2$ implies convergence in $\mathscr{L}^1.$ (Also, splitting a function $f$ in $\mathscr{L}^2$ as $|f| = |f|\mathbf{1}_{\{|f|\leq 1\}}+|f|\mathbf{1}_{\{|f|>1\}} \leq 1+|f|^2,$ one gets that $\mathscr{L}^2 \subset \mathscr{L}^1$.)
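To spell out the last step, which is only implicit above: since $\mathscr{L}^1$-convergence allows passing to the limit in the integral,
$$\left|\int g\,d\mu\right| = \left|\int g\,d\mu - \int g_n\,d\mu\right| = \left|\int (g-g_n)\,d\mu\right| \le \|g-g_n\|_1 \le \|g-g_n\|_2 \xrightarrow[n\to\infty]{} 0,$$
hence $\int g\,d\mu = 0.$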

Another answer

False for the original version, where $\mu$ was not assumed finite.

Consider, on $\mathbb{R}$ with Lebesgue measure, $\displaystyle g_n = \sum_{k = 1}^n \dfrac{1}{k} \left( \mathbf{1}_{[k, k + \frac{1}{2})} - \mathbf{1}_{[k + \frac{1}{2}, k + 1)} \right).$ Then $g_n$ has integral zero and converges in the $\mathscr{L}^2$ sense to $\displaystyle g = \sum_{k = 1}^\infty \dfrac{1}{k} \left( \mathbf{1}_{[k, k + \frac{1}{2})} - \mathbf{1}_{[k + \frac{1}{2}, k + 1)} \right),$ but $g$ does not even belong to $\mathscr{L}^1,$ hence its integral does not exist.
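For completeness, both claims can be checked directly (these computations are mine, not part of the original answer); since the summands have disjoint supports of length $1$,
$$\|g - g_n\|_2^2 = \sum_{k=n+1}^{\infty} \frac{1}{k^2} \xrightarrow[n\to\infty]{} 0, \qquad \int |g|\,d\mu = \sum_{k=1}^{\infty} \frac{1}{k} = \infty,$$
so $g_n \to g$ in $\mathscr{L}^2$, while $\int g^+\,d\mu = \int g^-\,d\mu = \infty$ and the integral of $g$ is undefined.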