While reading about the Lebesgue differentiation theorem, I came up with the following conjecture, which I can neither prove nor find a counterexample to.
Let $f\in L_{\mathrm{loc}}^1(\mathbb{R}^n)$. If $$ \int_{B_r(x)} f(y)\,dy=0 $$ holds for every $r\geq 1$ and every $x\in \mathbb{R}^n$, can we conclude that $f(x)=0$ a.e.? Note that $r$ is restricted to be at least $1$, which prevents us from taking advantage of the Lebesgue differentiation theorem directly. When $n=1$, this seems to be true.
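To see why the whole family of radii matters, here is a quick numerical illustration (mine, not part of the original question; it assumes `numpy` and `scipy` are available): $f(y)=\sin(\pi y)$ is not a.e. zero, yet its integral over *every* interval of length $2$ (a ball of radius exactly $1$) vanishes, while its integrals over balls of radius $3/2$ do not. So the hypothesis for a single radius is far weaker than the hypothesis for all $r\ge1$.

```python
import numpy as np
from scipy.integrate import quad

# f(y) = sin(pi*y) integrates to 0 over every interval of length 2
# (one fixed radius), yet f is not a.e. zero.
f = lambda y: np.sin(np.pi * y)

for x in [0.3, -1.7, 5.2]:
    I1  = quad(f, x - 1.0, x + 1.0)[0]  # radius 1: zero up to rounding
    I15 = quad(f, x - 1.5, x + 1.5)[0]  # radius 3/2: generically nonzero
    print(f"x = {x:+.1f}:  r=1 -> {I1: .2e},  r=3/2 -> {I15: .2e}")
```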
For $n=1$ it is true.
Proof: Let $x\in\mathbb{R}$ and $r>0$. For every $R\ge1$, the intervals $[x+r-2R,\,x+r]$ and $[x-r-2R,\,x-r]$ both have length $2R$, i.e. they are balls of radius $R\ge1$, so their integrals vanish by assumption. By additivity of the integral, $$0=\int_{x+r-2R}^{x+r}f(y)\,dy - \int_{x-r-2R}^{x-r}f(y)\,dy = \int_{x-r}^{x+r}f(y)\,dy - \int_{x-r-2R}^{x+r-2R}f(y)\,dy$$ for all $R\ge1$. In other words, $$g_r(x'):=\int_{x'-r}^{x'+r}f(y)\,dy$$ satisfies $g_r(x')=g_r(x'-2R)$ for all $x'$ and all $R\ge1$; comparing any two centers with a third center far to the right of both, we see that $g_r$ is constant on $\mathbb{R}$. For $r\ge1$ this constant is $0$ by assumption. For $0<r<1$, pick an integer $k$ with $kr\ge1$ and tile an interval of length $2kr$ (a ball of radius $kr\ge1$) by $k$ consecutive intervals of length $2r$: the total integral is $0$ while each tile contributes $g_r$, so $kg_r=0$ and hence $g_r\equiv0$. Thus $\int_{B_r(x)}f(y)\,dy=0$ for every $r>0$ and every $x$, and the Lebesgue differentiation theorem yields $f=0$ a.e.
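As a sanity check (not part of the proof; again assuming `numpy` and `scipy`), the rearrangement above is pure additivity of the integral, so it holds for *any* locally integrable $f$; the hypothesis enters only to make the two radius-$R$ integrals on the left vanish. A quick numerical confirmation with an arbitrary test function:

```python
import numpy as np
from scipy.integrate import quad

# The identity
#   int_{x+r-2R}^{x+r} f - int_{x-r-2R}^{x-r} f
#     = int_{x-r}^{x+r} f - int_{x-r-2R}^{x+r-2R} f
# is additivity of the integral, valid for any f; only the vanishing
# of the left-hand side uses the hypothesis of the problem.
f = lambda y: np.exp(-y**2) * np.cos(3 * y)  # arbitrary test function

rng = np.random.default_rng(0)
for _ in range(5):
    x, r, R = rng.uniform(-3, 3), rng.uniform(0.1, 2), rng.uniform(1, 4)
    lhs = quad(f, x + r - 2*R, x + r)[0] - quad(f, x - r - 2*R, x - r)[0]
    rhs = quad(f, x - r, x + r)[0] - quad(f, x - r - 2*R, x + r - 2*R)[0]
    print(f"x={x:+.2f}, r={r:.2f}, R={R:.2f}:  |lhs - rhs| = {abs(lhs - rhs):.1e}")
```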
I don't know whether a similar proof can work for $n\ge2$. If we were able to work with squares instead of balls, then it should, but with balls we get "moons" which are difficult to compare; the identity below makes this precise.
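To spell out what goes wrong (my elaboration): for axis-parallel cubes $Q_r(x)=\prod_{i=1}^n[x_i-r,\,x_i+r]$, the same cancellation as in the one-dimensional proof gives, with $C=\prod_{i=2}^n[x_i-r,\,x_i+r]$, $$\int_{Q_r(x)}f-\int_{Q_r(x-2Re_1)}f=\int_{[x_1+r-2R,\,x_1+r]\times C}f-\int_{[x_1-r-2R,\,x_1-r]\times C}f,$$ and the two sets on the right are translates of a single box (first side $2R$, remaining sides $2r$), so the one-dimensional argument would carry over if the hypothesis covered such boxes. For balls, the corresponding difference regions $B_r(x)\setminus B_r(x-2Re_1)$ are lunes ("moons") rather than balls, and the hypothesis gives no direct information about them.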