Prove that if $f\in L^1([0,1],\lambda)$ is not constant almost everywhere then there exists an interval so that $\int_I\!f\,\mathrm{d}\lambda\neq 0$. Here $\lambda$ is the Lebesgue measure.
Since this is obviously true for continuous functions, I've been trying to use the fact that continuous functions with compact support are dense in $L^1$, but I'm not sure how to set it up.
Suppose you had an $f$ with $\int_I f \, d\lambda = 0$ for every interval $I \subseteq [0,1]$. Let $f_+$ and $f_-$ be the positive and negative parts of $f$, i.e. $f = f_+ - f_-$ with $f_\pm \geq 0$. Now, notice the following:
1) They both induce measures defined by $\mu_+(A) = \int_A f_+ \, d\lambda$, and similarly for $\mu_-$.
2) We can restate the given condition as saying that these measures agree on the ring of intervals (i.e. the collection of finite unions of intervals): $\int_I f \, d\lambda = 0$ means $\mu_+(I) = \mu_-(I)$.
3) Since $f \in L^1$, these are finite measures; in particular they are $\sigma$-finite.
4) By the uniqueness part of the Carathéodory extension theorem, these measures must be equal on the whole Borel $\sigma$-algebra.
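For reference, here is a sketch of the uniqueness statement being invoked, phrased via the $\pi$–$\lambda$ theorem (the intervals form a $\pi$-system generating the Borel $\sigma$-algebra, so this gives the same conclusion as Carathéodory uniqueness):

```latex
\textbf{Uniqueness lemma (sketch).}
Let $\mu_+,\mu_-$ be finite measures on $\mathcal{B}([0,1])$.
Suppose $\mu_+(I)=\mu_-(I)$ for every interval $I\subseteq[0,1]$.
The intervals form a $\pi$-system generating $\mathcal{B}([0,1])$,
and $\mu_+([0,1])=\mu_-([0,1])<\infty$, so
\[
  \mu_+(B)=\mu_-(B)\qquad\text{for all } B\in\mathcal{B}([0,1]).
\]
```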
If $f$ is not $0$ a.e., then wlog the set $A = \{f > 0\}$ has $\lambda(A) > 0$. But then $\mu_+(A) > 0$ while $\mu_-(A) = \int_A f_- \, d\lambda = 0$ (since $f_- = 0$ on $A$), contradicting 4). So all interval integrals vanishing forces $f = 0$ a.e.; in particular $f$ is constant a.e., which is the contrapositive of the claim.
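As a numerical illustration (not part of the proof), consider the hypothetical example $f(x) = x - \tfrac12$: it is not constant a.e. and integrates to $0$ over all of $[0,1]$, yet a subinterval such as $[0,\tfrac12]$ already gives a nonzero integral, as the result predicts.

```python
import numpy as np

def interval_integral(f, a, b, n=100_001):
    """Approximate int_a^b f(x) dx with the composite trapezoidal rule
    (exact for this linear example, up to floating-point error)."""
    x = np.linspace(a, b, n)
    y = f(x)
    h = (b - a) / (n - 1)
    return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

# Example function: not constant a.e., but with zero mean over [0, 1].
f = lambda x: x - 0.5

total = interval_integral(f, 0.0, 1.0)  # full-interval integral vanishes
half = interval_integral(f, 0.0, 0.5)   # a subinterval detects f != const
print(total, half)
```

For this $f$, the full integral is $0$ while $\int_0^{1/2} (x - \tfrac12)\,dx = -\tfrac18$, so the interval $[0,\tfrac12]$ witnesses the conclusion of the problem.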