Let $(\Omega,\mathcal{A},\mu)$ be a finite measure space, and let $\{g_n\}$ be a sequence in $L^2(\Omega,\mathcal{A},\mu)$ converging to $g$ in the $L^2$-norm. Show that if $\int g_n\,d\mu=0$ for all $n$, then $\int g \, d\mu=0$.
First of all, I don't actually know whether this is true; I have reduced another exercise to this question, and now I don't see how to proceed any further. Any hints (for proving or disproving) would be appreciated.
This is true for the edited version of the question.
One may assume that $\mu$ is a probability measure: if $\mu(\Omega) > 0$ (when $\mu(\Omega) = 0$ every integral vanishes and there is nothing to prove), divide by $\mu(\Omega)$ to obtain a new measure under which each $L^p$-norm is multiplied by the positive constant $\mu(\Omega)^{-1/p}$, so neither the convergence nor the vanishing of the integrals is affected. So, I shall assume $\mu$ is a probability measure.
In this case, a direct application of Hölder's inequality gives at once $$\|f\|_1 \leq \|f\|_2 \|1\|_2 = \|f\|_2;$$ by translation invariance of the norm, convergence in $\mathscr{L}^2$ implies convergence in $\mathscr{L}^1.$ Hence $$\left|\int g\,d\mu\right| = \left|\int (g - g_n)\,d\mu\right| \leq \|g - g_n\|_1 \leq \|g - g_n\|_2 \to 0,$$ so $\int g\,d\mu = 0.$ (Also, writing a function $f$ in $\mathscr{L}^2$ as $|f| = |f|\mathbf{1}_{\{|f|\leq 1\}}+|f|\mathbf{1}_{\{|f|>1\}} \leq |f|^2+1,$ one gets that $\mathscr{L}^2 \subset \mathscr{L}^1$.)
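If it helps to see the inequality chain numerically, here is a small sanity check. It is only an illustration, assuming $\Omega = [0,1]$ with Lebesgue measure (a probability space), discretized on a uniform midpoint grid; the choices $g(x) = \sin(2\pi x)$ and $g_n = g + \frac{1}{n}\sin(4\pi x)$ (all of which integrate to $0$) are my own, not from the question.

```python
# Illustrative check on Omega = [0,1] with Lebesgue measure, discretized on a
# uniform midpoint grid. The specific functions g and g_n are hypothetical
# examples with zero integral and g_n -> g in L^2.
import numpy as np

N = 100_000
x = (np.arange(N) + 0.5) / N           # midpoints of a uniform grid on [0, 1]

g = np.sin(2 * np.pi * x)              # integral over [0, 1] is 0

def g_n(n):
    # each g_n integrates to 0, and ||g_n - g||_2 = (1/n) / sqrt(2) -> 0
    return g + np.sin(4 * np.pi * x) / n

for n in (1, 10, 100):
    diff = g_n(n) - g
    l1 = np.mean(np.abs(diff))         # ||g_n - g||_1 on the probability space
    l2 = np.sqrt(np.mean(diff ** 2))   # ||g_n - g||_2
    gap = abs(np.mean(g_n(n)) - np.mean(g))   # |int g_n - int g|
    # the chain |int g_n - int g| <= ||g_n - g||_1 <= ||g_n - g||_2
    assert gap <= l1 + 1e-12
    assert l1 <= l2 + 1e-12
    print(f"n={n:3d}  gap={gap:.2e}  L1={l1:.4f}  L2={l2:.4f}")
```

Both inequalities hold at every $n$, and the $L^1$ and $L^2$ distances shrink like $1/n$ while the integral gap stays at roundoff level, matching the argument above.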