I have been thinking on and off about a problem for some time now. It is inspired by an exam problem which I solved, but for which I wanted to find an alternative solution. The objective was to prove that a certain sequence of functions converges weakly to zero in $L^2$.
I managed to show (with some help) that the limit $f$ (of a subsequence) satisfies $\int_0^x f \, dm = 0$ for all $x > 0$. From this I want to conclude that $f = 0$ a.e. I can do this with the fundamental theorem of calculus in its Lebesgue version, but there ought to be a more elementary proof.
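Concretely, the argument I have in mind: writing $F(x) := \int_0^x f \, dm$, the function $F$ is identically zero, while the Lebesgue differentiation theorem (applicable since $f \in L^2$ is locally integrable) gives
$$ f(x) = F'(x) = 0 \quad \text{for a.e. } x > 0. $$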
Can someone here help me out?
Indeed, as you expected, a simple proof of the result can be found; see Theorem 2.1 in this useful note on absolutely continuous functions.
EDIT: Since this is quite an important result, it is worth giving the proof here in detail. The proof below is essentially the one given in the link above, but somewhat shorter.
Theorem. If $f$ is integrable on $[a,b]$ and $\int_a^x f(t) \, dt = 0$ for all $x \in [a,b]$, then $f = 0$ a.e. on $[a,b]$.
Proof. An open subset $O$ of $(a,b)$ is a countable union of pairwise disjoint open intervals $(c_n, d_n)$; hence, by countable additivity and the hypothesis,
$$ \int_O f(t) \, dt = \sum\limits_{n = 1}^\infty \int_{c_n}^{d_n} f(t) \, dt = \sum\limits_{n = 1}^\infty \bigg( \int_a^{d_n} f(t) \, dt - \int_a^{c_n} f(t) \, dt \bigg) = 0. $$
If $K$ is a closed subset of $[a,b]$, then
$$ \int_K f(t) \, dt = \int_a^b f(t) \, dt - \int_{[a,b] \setminus K} f(t) \, dt = 0 - 0 = 0, $$
since the first integral vanishes by hypothesis (take $x = b$), and $[a,b] \setminus K$ differs from the open set $(a,b) \setminus K$, handled above, by at most the two endpoints, a null set.
Next, let $E_+ = \{ x \in [a,b] : f(x) > 0 \}$ and $E_- = \{ x \in [a,b] : f(x) < 0 \}$. If $\lambda(E_+) > 0$, then by inner regularity of Lebesgue measure there exists a closed set $K \subset E_+$ with $\lambda(K) > 0$. But $\int_K f(t) \, dt = 0$ with $f > 0$ on $K$, hence $f = 0$ a.e. on $K$; this contradicts $f > 0$ everywhere on the set $K$ of positive measure, and shows that $\lambda(E_+) = 0$. Similarly, $\lambda(E_-) = 0$. The theorem is thus established.
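The last step uses the standard fact that a nonnegative integrable function with vanishing integral is zero a.e.; for completeness, here is a sketch via Chebyshev's inequality, with the sets $K_n$ introduced only for this remark:
$$ K_n := \Big\{ x \in K : f(x) \ge \tfrac{1}{n} \Big\}, \qquad \lambda(K_n) \le n \int_{K_n} f(t) \, dt \le n \int_K f(t) \, dt = 0, $$
so $\lambda(K) \le \sum_{n=1}^\infty \lambda(K_n) = 0$, since $f > 0$ on $K$ gives $K = \bigcup_{n=1}^\infty K_n$.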