Let $\gamma_n$ be a non-negative function in $L^1(\mathbb{R})$ such that $\gamma_n = 0$ outside $[-1/n,1/n]$ and $\int_{-\infty}^\infty\gamma_n(t)\,dt = 1$. I want to show that $$ \lim_{n\to \infty} \|f*\gamma_n - f\|_1 = 0, \quad \forall f\in L^1(\mathbb{R}), $$ which is essentially showing that $(\gamma_n)$ is an approximate identity for convolution in $L^1(\mathbb{R})$, i.e. that it plays the role of the Dirac delta in the limit.
In the case where $f\in C(\mathbb{R})$ and I take an explicit form for $\gamma_n$, such as $\gamma_n = \frac{n}{2}\mathbf{1}_{[-1/n,1/n]}$, I can show that $\lim_{n\to \infty} \int f(\tau)\gamma_n(t-\tau)\,d\tau = f(t)$ pretty straightforwardly using the mean value theorem for integrals. But how can we show it in the general $L^1(\mathbb{R})$ case?
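For concreteness, here is the computation I have in mind in that special case (a sketch, using the mean value theorem for integrals): $$ (f*\gamma_n)(t) = \int_{-\infty}^{\infty} f(\tau)\,\gamma_n(t-\tau)\,d\tau = \frac{n}{2}\int_{t-1/n}^{t+1/n} f(\tau)\,d\tau = f(\xi_n) $$ for some $\xi_n \in [t-1/n,\,t+1/n]$, and $f(\xi_n) \to f(t)$ as $n\to\infty$ by continuity of $f$.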
As you said, this is easy for continuous functions (with compact support, say, to stay in $L^1(\mathbb R)$): if $g \in C_c(\mathbb R)$, then $g$ is uniformly continuous, so $\gamma_n * g \to g$ uniformly on a fixed compact set containing all the supports, hence in $L^1$. These functions are dense in $L^1(\mathbb R)$, and the operators $T_n(f) = \gamma_n\ast f$ are uniformly bounded: by Young's inequality, $\|T_n\| \le \|\gamma_n\|_1 = 1$, so we may take $c = 1$. Given $\varepsilon > 0$, pick $g \in C_c(\mathbb R)$ with $\|f - g\|_1 \le \varepsilon$. The triangle inequality then yields $$ \|T_n(f)-f\|_1 = \|T_n(f-g)+(T_n(g)-g)+(g-f)\|_1 \le \|T_n\|\|f-g\|_1+\|T_n(g)-g\|_1+\|g-f\|_1, $$ and since the middle term tends to $0$ as $n\to\infty$, $$ \limsup_{n\to\infty}\|T_n(f)-f\|_1 \le (c+1)\varepsilon. $$ As $\varepsilon > 0$ was arbitrary, the limit is $0$.
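Not part of the proof, but a quick numerical sanity check may be reassuring. The sketch below is my own illustration: the choice $f = \mathbf{1}_{[0,1]}$ (discontinuous but in $L^1$), the box kernel $\gamma_n = \frac{n}{2}\mathbf{1}_{[-1/n,1/n]}$, and the grid are all assumptions made for the demo. It approximates $\|f*\gamma_n - f\|_1$ by Riemann sums and shows it shrinking as $n$ grows.

```python
# Numerical sanity check (not a proof): approximate ||f * gamma_n - f||_1
# for f = 1_[0,1] and gamma_n = (n/2) * 1_[-1/n, 1/n].
import numpy as np

dx = 1e-4                                    # grid spacing
x = np.arange(-2.0, 3.0, dx)                 # grid covering supp(f) with margin
f = ((x >= 0.0) & (x <= 1.0)).astype(float)  # f = indicator of [0, 1]

for n in [1, 2, 4, 8, 16, 32]:
    half = 1.0 / n                           # gamma_n is supported on [-1/n, 1/n]
    t = np.arange(-half, half + dx, dx)      # grid for the kernel
    kernel = np.full_like(t, n / 2.0)        # constant value n/2 on its support
    kernel /= kernel.sum() * dx              # renormalize: discrete integral = 1
    conv = np.convolve(f, kernel, mode="same") * dx  # (f * gamma_n) on the grid
    err = np.sum(np.abs(conv - f)) * dx      # Riemann sum for the L^1 norm
    print(f"n = {n:3d}:  ||f*gamma_n - f||_1 ~ {err:.5f}")
```

The printed errors decay roughly like $1/n$, which matches the width $2/n$ of the region around each jump of $f$ where the convolution differs from $f$.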