I'm struggling with this kind of problem:
I have the assumption that $f$ and $g$ are in $L^2(\mathbb{R})$, and I should prove that $(f\star g)(x) \rightarrow 0$ as $|x| \rightarrow \infty$. I think (but I'm not sure) that I should use the theorem that says $\|f \star g\|_\infty \le \|f\|_2\,\|g\|_2$.
I would be grateful for any help with this problem. Thanks!
It is useful to use the Fourier transform. By the convolution theorem we have $$\mathcal{F}(f\star g)= \mathcal{F} f \cdot \mathcal{F} g.$$
Apply the Fourier transform to both sides to get $$\mathcal{F}\mathcal{F} (f\star g) = \mathcal{F}(\mathcal{F} f \cdot \mathcal{F} g).$$
Now use that $\mathcal{F}\circ \mathcal{F}$ is a nonzero constant times the reflection in the argument, so the left-hand side equals $$\mathrm{const} \cdot (f\star g)(-x).$$ For the right-hand side, note that $\mathcal{F}f$ and $\mathcal{F}g$ are also in $L^2$ (by Plancherel), so by Cauchy–Schwarz their product is in $L^1$: $\|\mathcal{F}f \cdot \mathcal{F}g\|_1 \le \|\mathcal{F}f\|_2\,\|\mathcal{F}g\|_2 < \infty$. The Fourier transform of an $L^1$ function is a continuous function that converges to $0$ at infinity (Riemann–Lebesgue lemma). We conclude that $f\star g$ converges to $0$ as $|x| \to \infty$.
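As a sanity check (an illustration, not a proof), one can discretize two concrete $L^2$ functions of my own choosing, say $f(x)=e^{-|x|}$ and $g(x)=1/(1+x^2)$, approximate the convolution on a grid, and observe both the bound $\|f\star g\|_\infty \le \|f\|_2\|g\|_2$ and the decay of $f\star g$ far from the origin:

```python
import numpy as np

# Grid on a truncated domain; both example functions decay fast enough
# that the truncation error is negligible for this illustration.
dx = 0.01
x = np.arange(-20, 20, dx)

f = np.exp(-np.abs(x))      # e^{-|x|}, an L^2(R) function
g = 1.0 / (1.0 + x**2)      # 1/(1+x^2), an L^2(R) function

# Riemann-sum approximation of (f * g)(x) on the same grid.
conv = np.convolve(f, g, mode="same") * dx

# Discrete approximations of the L^2 norms.
norm_f = np.sqrt(np.sum(f**2) * dx)
norm_g = np.sqrt(np.sum(g**2) * dx)

sup_conv = np.max(np.abs(conv))
print("sup |f*g|        =", sup_conv)
print("||f||_2 ||g||_2  =", norm_f * norm_g)
print("|f*g| at the ends:", abs(conv[0]), abs(conv[-1]))
```

The printed supremum stays below the product of the norms, and the values of $f\star g$ near the ends of the grid are small compared to its peak, consistent with the decay at infinity proved above.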