I have been stuck with something which is perhaps right in front of my eyes, but I just cannot get it. I have a sequence $(f_n)$ in $L^2(\mathbb R)$ which converges to $f$ in the $L^2$-topology. If I assume that $\int_{\mathbb R}f_n(x)\,dx=0$ for all $n$, does it follow that $\int_{\mathbb R}f(x)\,dx=0$? Sorry if this is too obvious.
Thanks for any answers.
Counterexample: let $f_n = \chi_{[0, 1]} - \frac{1}{n} \chi_{[1, n+1]}$. Then $f_n \to \chi_{[0, 1]}$ in $L^2(\mathbb{R})$, since $\|f_n - \chi_{[0,1]}\|_2 = \sqrt{n \cdot \frac{1}{n^2}} = \frac{1}{\sqrt{n}} \to 0$. However, $\int_{\mathbb{R}} f_n(x)\,dx = 1 - \frac{1}{n} \cdot n = 0$ for each $n$, while $\int_{\mathbb{R}} \chi_{[0,1]}(x)\,dx = 1$.
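Not part of the original answer, but here is a quick numerical sanity check of the counterexample (a sketch using NumPy and a simple Riemann-sum approximation; the function names and grid spacing are my own choices). It confirms that $\int f_n \approx 0$ for each $n$ while $\|f_n - \chi_{[0,1]}\|_2 \approx 1/\sqrt{n}$, which tends to $0$:

```python
import numpy as np

def check(n, dx=1e-3):
    """Approximate the integral of f_n and the L^2 distance to chi_[0,1]."""
    # Grid covering the support [0, n+1] of f_n.
    x = np.arange(0.0, n + 1, dx)
    chi_01 = np.where(x < 1, 1.0, 0.0)                    # chi_[0,1]
    tail = np.where(x >= 1, 1.0, 0.0)                     # chi_[1,n+1] on this grid
    f_n = chi_01 - (1.0 / n) * tail
    integral = np.sum(f_n) * dx                           # ~ integral of f_n
    l2_dist = np.sqrt(np.sum((f_n - chi_01) ** 2) * dx)   # ~ ||f_n - chi_[0,1]||_2
    return integral, l2_dist

for n in (10, 100, 1000):
    i, d = check(n)
    print(f"n={n:5d}  integral={i:+.4f}  L2 distance={d:.4f}  1/sqrt(n)={n**-0.5:.4f}")
```

The integral stays (numerically) at $0$ for every $n$, yet the $L^2$ distance to the limit shrinks like $1/\sqrt{n}$, so the limit function can carry a nonzero integral.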