Sorry if this has been answered, I could not find the solution anywhere.
Suppose $f,g \in L^2(\mathbb{R})$. Show that the convolution $f \star g$ defined via
$$f \star g(x):=\int_{-\infty}^\infty f(y)g(x-y) dy$$
is continuous. My plan is to use the $\epsilon - \delta$ definition of continuity, relying on the fact that $\int \vert f(x) \vert^2 dx$ and $\int \vert g(x) \vert^2 dx$ are both finite (so I can square the $L^2$ norms to get rid of the square roots). If this exact question has been answered, please just link me to it, and I apologize for any repeat question. Do remember that as time goes on there are newer and newer students coming into the field, myself included, who have not seen these problems here before, nor the tricks needed to attack some of these proofs. Most of us who ask are not seasoned mathematicians but aspiring ones, so when my question gets closed as a duplicate, or dismissed as a "dumb" question, it really discourages me from wanting to learn more of this beautiful field we call mathematics. Thanks in advance.
So do I take $\epsilon >0$ arbitrary and look for $\delta$ such that whenever $\vert x - x_0 \vert < \delta$, then $\vert f \star g(x)- f \star g(x_0) \vert < \epsilon$, for every $x_0 \in \mathbb{R}$? Since (as in most analysis proofs) I need to work backwards to find $\delta$, I can unpack $\vert f \star g(x)- f \star g(x_0) \vert$ using the definition of the convolution. Am I headed in the right direction? Can I then apply the triangle inequality as well?
Cauchy–Schwarz gives, for any shift $h$ (using $h$ rather than $\epsilon$ to avoid clashing with the $\epsilon$ of the $\epsilon-\delta$ definition), $$|f\star g(x)-f\star g(x+h)|\le \|f\|_{L^2}\|g(\cdot+h)-g\|_{L^2}$$
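To spell out the Cauchy–Schwarz step (a sketch, writing $h$ for the shift):
$$\begin{align*}
|f\star g(x) - f\star g(x+h)|
  &= \left| \int_{-\infty}^{\infty} f(y)\,\bigl(g(x-y) - g(x+h-y)\bigr)\, dy \right| \\
  &\le \|f\|_{L^2}\, \left( \int_{-\infty}^{\infty} \bigl|g(x-y) - g(x+h-y)\bigr|^2\, dy \right)^{1/2} \\
  &= \|f\|_{L^2}\, \|g(\cdot+h) - g\|_{L^2},
\end{align*}$$
where the last equality uses the translation invariance of Lebesgue measure (substitute $u = x-y$). Note the bound is uniform in $x$, so this argument actually gives uniform continuity of $f \star g$.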
It remains to show that $$\lim_{h \to 0} \|g(\cdot+h)-g\|_{L^2}=0,$$ i.e. that translation is continuous in $L^2$. This can be proved directly from the definition of $L^2$ and the Lebesgue measure/sigma-algebra, but note that it also follows easily from the density of $C^0_c$ in $L^2$.
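As a numerical sanity check (not a proof), here is a sketch that estimates $\|g(\cdot+h)-g\|_{L^2}$ on a grid for the indicator function of $[0,1]$, an $L^2$ function that is not continuous; for this $g$ the exact value is $\sqrt{2h}$ for $0 < h < 1$. The grid and step sizes are arbitrary choices:

```python
import numpy as np

def shift_norm(g, x, dx, h):
    # Riemann-sum approximation of ||g(. + h) - g||_{L^2} on the grid x
    return np.sqrt(np.sum((g(x + h) - g(x)) ** 2) * dx)

def g(x):
    # Indicator of [0, 1]: in L^2 but discontinuous at 0 and 1
    return ((x >= 0.0) & (x <= 1.0)).astype(float)

dx = 1e-5
x = np.arange(-5.0, 5.0, dx)
norms = [shift_norm(g, x, dx, h) for h in (0.1, 0.01, 0.001)]
print(norms)  # decreases toward 0; exact values are sqrt(2h)
```

The point of the example: even though $g$ jumps, the shifted copies only disagree on a set of measure $2h$, so the $L^2$ distance still tends to $0$ as $h \to 0$.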