Let $f\in L^1(\mathbb{R})$. Show that $$\lim_{n\to \infty}\sum_{k=-n^2}^{n^2}\left|\int_{k/n}^{(k+1)/n}f(x)\,dx\right| = \int_{-\infty}^\infty |f(x)|\,dx.$$
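Not part of the problem, but a quick numerical sanity check of the identity is easy to run. This is only a sketch: the test function $f(x)=\sin(3x)$ on $[-\pi,\pi]$ (and $0$ elsewhere) and its exact value $\int|f|\,dx = 4$ are my own choices, and the per-interval integrals are computed from the antiderivative rather than numerically.

```python
import math

# Hypothetical test function: f(x) = sin(3x) on [-pi, pi], 0 elsewhere,
# chosen so that int |f| dx = 4 exactly (6 half-periods, each contributing 2/3).

PI = math.pi

def antideriv(x):
    # antiderivative of sin(3x)
    return -math.cos(3 * x) / 3

def integral(a, b):
    # exact integral of f over [a, b], clipping to the support [-pi, pi]
    a, b = max(a, -PI), min(b, PI)
    if a >= b:
        return 0.0
    return antideriv(b) - antideriv(a)

def lhs(n):
    # sum_{k=-n^2}^{n^2} | int_{k/n}^{(k+1)/n} f dx |
    return sum(abs(integral(k / n, (k + 1) / n))
               for k in range(-n * n, n * n + 1))

for n in (2, 10, 100):
    print(n, lhs(n))  # approaches int |f| = 4 as n grows
```

Note that $\mathrm{lhs}(n) \le \int|f|$ always, since $|\int_I f| \le \int_I |f|$ on each interval; the interesting content is that the gap closes.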
Attempt at Solution: Approximating $f$ in the $L^1$ norm by a continuous function with compact support seems like it should help, so suppose first that $f\in C_c(\mathbb{R})$. If $f$ doesn't change sign on an interval $I$, then $|\int_I f\,dx| = \int_I|f|\,dx$. If $f$ is continuous on a compact set, I want to say that $f$ changes sign only finitely often there. Basically, we only pick up errors on the LHS when we integrate over an interval where $f$ changes sign. The intervals shrink as $n\to \infty$, and the continuity of $f$ should ensure that the errors shrink in the limit too. I'm just not sure how to make this rigorous.
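For what it's worth, the reduction to the continuous case can be written out as follows (a sketch; the notation $S_n$ is mine):

```latex
% Sketch of the reduction from f in L^1 to continuous, compactly supported f.
Write $S_n(f) = \sum_{k=-n^2}^{n^2} \bigl| \int_{k/n}^{(k+1)/n} f \, dx \bigr|$.
For any $g$, the reverse triangle inequality on each interval gives
\[
  |S_n(f) - S_n(g)|
    \le \sum_{k=-n^2}^{n^2} \int_{k/n}^{(k+1)/n} |f - g| \, dx
    \le \|f - g\|_1 ,
\]
and likewise $\bigl| \|f\|_1 - \|g\|_1 \bigr| \le \|f - g\|_1$. So if
$g \in C_c(\mathbb{R})$ with $\|f - g\|_1 < \epsilon$, then
\[
  \limsup_{n\to\infty} \bigl| S_n(f) - \|f\|_1 \bigr|
    \le 2\epsilon + \limsup_{n\to\infty} \bigl| S_n(g) - \|g\|_1 \bigr| ,
\]
and the last term is $0$ once the continuous case is settled.
```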
I would make this a comment if I had enough rep; I'll be happy to delete it if you want. For continuous, compactly supported $f$, use uniform continuity: for $n\geq N_\epsilon$, any interval $[k/n,(k+1)/n]$ on which $f$ changes sign contains a zero of $f$, and hence $|f|\leq\epsilon$ on that whole interval. Each such interval then contributes an error of at most $\epsilon\cdot 1/n$, and there are only $O(n)$ of them (since we're working inside a bounded interval), so the error terms contribute only $O(\epsilon)$ to the LHS. Since $\epsilon$ was arbitrary, we're done.
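Spelled out, assuming $\operatorname{supp} f \subset [-M, M]$ (the constant $M$ is my own bookkeeping), the argument might run:

```latex
% Sketch of the uniform-continuity argument for f continuous with
% supp f contained in [-M, M].
Let $\epsilon > 0$. By uniform continuity choose $\delta > 0$ so that
$|f(x) - f(y)| < \epsilon$ whenever $|x - y| < \delta$, and take $N$ with
$1/N < \delta$ and $N \ge M$. Fix $n \ge N$ and set $I_k = [k/n, (k+1)/n]$.
If $f$ has constant sign on $I_k$, then
$\bigl| \int_{I_k} f \bigr| = \int_{I_k} |f|$, so $I_k$ contributes no
error. If $f$ changes sign on $I_k$, it vanishes somewhere on $I_k$, hence
$|f| < \epsilon$ on all of $I_k$ and
\[
  0 \le \int_{I_k} |f| - \Bigl| \int_{I_k} f \Bigr|
    \le \int_{I_k} |f| \le \frac{\epsilon}{n} .
\]
At most $2Mn + 2$ of the $I_k$ meet $[-M, M]$ (the rest contribute $0$),
so the total error is at most $(2M + 2)\epsilon$. Since $n \ge M$, the
intervals $I_k$ with $-n^2 \le k \le n^2$ cover $[-M, M]$, and therefore
\[
  0 \le \int_{-\infty}^{\infty} |f| \, dx
        - \sum_{k=-n^2}^{n^2} \Bigl| \int_{I_k} f \Bigr|
    \le (2M + 2)\,\epsilon .
\]
```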