I am asked to show that for an $L^1$ function $f$,
$$T_{\epsilon}f(x)=\frac{1}{\pi}\int_{-\infty}^{\infty}\frac{\epsilon}{y^2+\epsilon^2}f(x-y) dy$$
converges to $f(x)$ as $\epsilon\to 0^+$ for almost every $x$.
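For intuition (just a sanity check on my part, not part of the exercise), the claim can be verified by hand for $f=\chi_{[0,1]}$, where the integral has a closed form:
$$T_{\epsilon}\chi_{[0,1]}(x)=\frac{1}{\pi}\int_{x-1}^{x}\frac{\epsilon}{y^2+\epsilon^2}\,dy=\frac{1}{\pi}\left(\arctan\frac{x}{\epsilon}-\arctan\frac{x-1}{\epsilon}\right),$$
which tends to $1$ for $0<x<1$, to $0$ for $x<0$ or $x>1$, and to $\tfrac{1}{2}$ at the jump points $x=0,1$, so the convergence indeed holds off a null set.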
Since this question comes from a measure theory course, I thought of applying the dominated convergence theorem to the sequence of functions $\displaystyle f_n(y)=\frac{1/n}{y^2+(1/n)^2}f(x-y)$ for fixed $x$ (dropping the harmless constant factor $\frac{1}{\pi}$).
Firstly, $f_n$ is dominated by $f_1\in L^1(\mathbb{R})$ (i.e. $|f_n|\leqslant |f_1|$).
Also $f_n(y)\to 0$ for every $y\neq 0$, hence almost everywhere. But then the dominated convergence theorem would give $$\lim_{n\to\infty}\int_{-\infty}^{\infty}f_n(y)\,dy=\int_{-\infty}^{\infty}\lim_{n\to\infty}f_n(y)\,dy=\int 0\,dy= 0,$$
which is not the desired result.
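In fact, the explicit example above already rules this conclusion out: taking $f=\chi_{[0,1]}$ and $x=\tfrac{1}{2}$,
$$\lim_{n\to\infty}\int_{-\infty}^{\infty}f_n(y)\,dy=\lim_{n\to\infty}\pi\,T_{1/n}f\big(\tfrac{1}{2}\big)=\lim_{n\to\infty}2\arctan\frac{n}{2}=\pi\neq 0,$$
so at least one hypothesis of the dominated convergence theorem must fail.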
Could someone explain where my reasoning goes wrong, and what the correct approach is?
Thank you.
Use the convolution approach on $L^1(\mathbb{R})$:
$$ (f*g)(x)=\int\limits_{\mathbb{R}}f(x-t)g(t)~\text{d}t=\int\limits_{\mathbb{R}}g(x-t)f(t)~\text{d}t. $$ Then, we can write
$$ T_{\epsilon}f(x)=\dfrac{1}{\pi}\int\limits_{\mathbb{R}}f(x-y)\,g_{\epsilon}(y)~\text{d}y=\dfrac{1}{\pi}\,(f*g_{\epsilon})(x), $$
where
$$ g_{\epsilon}(y)=\dfrac{\epsilon}{y^2+\epsilon^2}. $$
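The key fact (standard, and verified by the elementary computation below) is that $\frac{1}{\pi}g_{\epsilon}$ is an approximate identity: it has unit mass for every $\epsilon>0$ and concentrates that mass near $y=0$ as $\epsilon\to 0^{+}$:
$$ \frac{1}{\pi}\int\limits_{\mathbb{R}}\frac{\epsilon}{y^2+\epsilon^2}~\text{d}y=\frac{1}{\pi}\Big[\arctan\frac{y}{\epsilon}\Big]_{-\infty}^{\infty}=\frac{1}{\pi}\Big(\frac{\pi}{2}+\frac{\pi}{2}\Big)=1. $$
This also pinpoints where your reasoning went wrong: the family of kernels admits no integrable majorant, since
$$ \sup_{\epsilon>0}\frac{\epsilon}{y^2+\epsilon^2}=\frac{1}{2|y|}\qquad(\text{attained at }\epsilon=|y|), $$
which is not integrable near $y=0$. In particular $|f_n|\leqslant|f_1|$ is false: at $y=1/n$ the $n$-th kernel equals $n/2$, while the $n=1$ kernel is at most $1$, so $|f_n(1/n)|>|f_1(1/n)|$ whenever $f(x-1/n)\neq 0$. For the correct approach, use the unit mass to write
$$ T_{\epsilon}f(x)-f(x)=\frac{1}{\pi}\int\limits_{\mathbb{R}}\big(f(x-y)-f(x)\big)\,g_{\epsilon}(y)~\text{d}y, $$
and a standard approximate-identity argument (splitting the integral into $|y|<\delta$ and $|y|\geqslant\delta$) shows that the right-hand side tends to $0$ at every Lebesgue point of $f$, i.e. for almost every $x$.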