I have the following situation: $f_n$ and $g$ are (Lebesgue) integrable functions on an interval $[a,b]$, and $\int_{[a,b]} |f_n|\,dx \to 0$ as $n \to \infty$. I want to prove that then $\int_{[a,b]} |g||f_n|\,dx$ also goes to zero as $n$ approaches infinity.
I thought that maybe there is a way to get a bound of the form $\int_{[a,b]} |f_n(x)| |g(x)| dx < M \int |f_n(x)|dx $ for some constant $M$, but I am not sure how to prove this.
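Such a bound does hold in the special case where $g$ is essentially bounded; a sketch, assuming $|g| \le M$ almost everywhere on $[a,b]$:

$$\int_{[a,b]} |f_n(x)||g(x)|\,dx \;\le\; M \int_{[a,b]} |f_n(x)|\,dx \;\longrightarrow\; 0.$$

The difficulty is that an integrable $g$ need not be essentially bounded, so this argument does not cover the general case.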
One can apply the Cauchy–Schwarz inequality, but then one has to show that
- $\int_{[a,b]} |f_n|^2\,dx$ goes to zero as $n$ approaches infinity,
- $\int_{[a,b]} |g|^2\,dx$ is finite.
I think the first one is fairly clear, because for $n$ large enough $|f_n|$ is smaller than $1$ almost everywhere. But I am not sure how to show the second claim.
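For reference, the Cauchy–Schwarz step being invoked here would read

$$\int_{[a,b]} |f_n||g|\,dx \;\le\; \left(\int_{[a,b]} |f_n|^2\,dx\right)^{1/2} \left(\int_{[a,b]} |g|^2\,dx\right)^{1/2},$$

which is why both factors on the right need to be controlled.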
In the general case, this is false: take $g(x)=\frac 1 {\sqrt x}$ and $f_n(x)=\frac 1 {n \sqrt x}$ on $[0,1]$. Here, $f_n g$ is not even integrable!
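Spelling out the computations behind this counterexample: each $f_n$ is integrable with

$$\int_0^1 |f_n|\,dx = \frac 1 n \int_0^1 \frac{dx}{\sqrt x} = \frac 1 n \left[\,2\sqrt x\,\right]_0^1 = \frac 2 n \;\longrightarrow\; 0,$$

yet the product is $f_n(x)\,g(x) = \frac{1}{nx}$, and

$$\int_0^1 \frac{dx}{nx} = +\infty,$$

so $\int_{[0,1]} |g||f_n|\,dx$ cannot go to zero; it is not even finite.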
You need to assume both of your bullets to apply Cauchy–Schwarz and get the claim. You claim that the first bullet is fairly clear, but that is not true: a sequence of functions whose integrals go to $0$ need not be smaller than $1$ almost everywhere for $n$ large enough (once again, see my example).
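To make this last point concrete with the same example $f_n(x)=\frac 1 {n\sqrt x}$ on $[0,1]$: for every $n$,

$$f_n(x) > 1 \iff n\sqrt x < 1 \iff x < \frac{1}{n^2},$$

so the set $\{x : f_n(x) > 1\}$ has Lebesgue measure $\frac{1}{n^2} > 0$. Hence no $f_n$ is bounded by $1$ almost everywhere, even though $\int_0^1 |f_n|\,dx \to 0$.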