Let $a_n$ be a sequence and $N\in\mathbb{N}$ be fixed. Assume that no more than $N$ of the $a_n$'s lie in any interval of length $1$. Show that for any $f\in L^1(\mathbb{R})$, we have $\lim_n f(x+a_n)=0$ for a.e. $x$.
How to approach this problem? Clearly, the convergence does not have to be uniform, or even pointwise, since $f$ need not tend to $0$ at infinity.
We prove, more strongly, that the series $\sum_n f(x + a_n)$ is absolutely convergent for almost all $x$, and that the sum function belongs to $L^1(I)$ for any bounded interval $I$.
We may assume $f \geq 0$ by replacing $f$ with $|f|$, if necessary.
Now let $F(x) = \sum_n f(x + a_n)$. This function takes its values in $[0,+\infty]$. We will prove that for any $b \in \mathbf{R}$, we have $F \in L^1([b,b+1])$. This will be enough to prove the desired result, since it will show that $F(x)$ is finite for almost all $x \in [b,b+1]$, and in fact (by letting $b$ take all values in $\mathbf{Z}$), for almost all $x \in \mathbf{R}$; and for every such $x$ the terms of the convergent series $\sum_n f(x + a_n)$ must tend to $0$, which is the claim.
To prove that $F \in L^1([b,b+1])$, it is enough by the monotone convergence theorem to check that the series $\sum_n \int_{b}^{b+1} f(x + a_n) \, dx$ converges. But $$ \sum_n \int_{b}^{b+1} f(x + a_n) \, dx = \sum_n \int_{a_n + b}^{a_n + b + 1} f(x) \, dx = \sum_n \int_{\mathbf{R}} \chi_{[a_n + b, a_n + b + 1]} f = \int_{\mathbf{R}} \sum_n \chi_{[a_n + b, a_n + b + 1]} f. $$
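As a quick numerical sanity check of the substitution in the first equality above (the Gaussian $f(x) = e^{-x^2}$ and the values $a = 3.7$, $b = -1.2$ are illustrative choices, not part of the problem), one can verify that $\int_b^{b+1} f(x + a)\,dx = \int_{a+b}^{a+b+1} f(x)\,dx$:

```python
import math

# Illustrative choices (not from the problem): f(x) = exp(-x^2), a = 3.7, b = -1.2.
f = lambda x: math.exp(-x * x)
a, b = 3.7, -1.2

def integrate(g, lo, hi, steps=100_000):
    """Midpoint-rule approximation of the integral of g over [lo, hi]."""
    h = (hi - lo) / steps
    return h * sum(g(lo + (k + 0.5) * h) for k in range(steps))

# Change of variables x -> x + a:
lhs = integrate(lambda x: f(x + a), b, b + 1)          # int_b^{b+1} f(x + a) dx
rhs = integrate(f, a + b, a + b + 1)                   # int_{a+b}^{a+b+1} f(x) dx
print(abs(lhs - rhs) < 1e-9)
```

The two midpoint sums sample $f$ at the same shifted points, so they agree to floating-point accuracy, mirroring the exact equality used in the proof.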
Now observe that a fixed $x \in \mathbf{R}$ satisfies $x \in [a_n + b, a_n + b + 1]$ if and only if $a_n \in [x - b - 1, x - b]$, which is an interval of length $1$; by the hypothesis, this happens for at most $N$ values of $n$. Therefore the integrand in the last integral above is bounded by $Nf \in L^1(\mathbf{R})$, so the last integral is finite, which is enough to prove what we wanted.
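The final bound $\sum_n \int_b^{b+1} f(x + a_n)\,dx \le N \int_{\mathbf{R}} f$ can also be illustrated numerically. In the sketch below the choices $f(x) = e^{-|x|}$ and $a_n = n$ for $n = 0, 1, 2, \dots$ are ours, not part of the problem; any interval of length $1$ contains at most two integers, so the hypothesis holds with $N = 2$:

```python
import math

# Illustrative choices (not from the problem): f(x) = exp(-|x|) is in L^1(R),
# and a_n = n for n >= 0. Any interval of length 1 contains at most N = 2
# integers, so the hypothesis is satisfied with N = 2.
f = lambda x: math.exp(-abs(x))
N = 2

def integrate(g, lo, hi, steps=10_000):
    """Midpoint-rule approximation of the integral of g over [lo, hi]."""
    h = (hi - lo) / steps
    return h * sum(g(lo + (k + 0.5) * h) for k in range(steps))

# Partial sum of  sum_n int_b^{b+1} f(x + a_n) dx,  with b = 0; the tail
# beyond n = 50 is negligible since f decays exponentially.
series = sum(integrate(lambda x, n=n: f(x + n), 0.0, 1.0) for n in range(50))

# The bound from the proof:  N * int_R f  (truncated to [-30, 30], where the
# tails of exp(-|x|) are below machine precision).
bound = N * integrate(f, -30.0, 30.0)

print(series <= bound)
```

Here the series sums to $\sum_{n \ge 0} e^{-n}(1 - e^{-1}) \approx 1$, comfortably below the bound $N \int_{\mathbf{R}} e^{-|x|} = 4$, consistent with the proof's conclusion that $F \in L^1([b, b+1])$.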