Suppose that $Z$ is a random vector whose range is compact. Let $f, f_n$ be functions of the random vector $Z$. I want to investigate the $L^2$ distance between $f$ and $f_n$ under two different probability distributions on $Z$.
Suppose that the two distributions we consider have densities $p_1(z)$ (uniform over the domain of $Z$) and $p_2(z)$. If $\|f_n(z)-f(z)\|_{L^2} \rightarrow 0$ under the density $p_1(z)$, does it imply that $\|f_n(z)-f(z)\|_{L^2} \rightarrow 0$ under the density $p_2(z)$?
I tried using Cauchy-Schwarz on $\int |f_n(z)-f(z)|^2 p_2(z) \,dz$, but I can't simplify it further.
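One partial observation (under the extra assumption, not stated above, that $p_2$ is essentially bounded): if $p_2(z) \le M$ almost everywhere and $p_1 \equiv c > 0$ is the uniform density, then
$$\int |f_n(z)-f(z)|^2 p_2(z)\,dz \;\le\; M \int |f_n(z)-f(z)|^2 \,dz \;=\; \frac{M}{c} \int |f_n(z)-f(z)|^2 p_1(z)\,dz \;\to\; 0,$$
so any counterexample must use an unbounded density $p_2$.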
I am inclined to believe this is false in general. Are there any well-known counterexamples, even from real analysis?
Let $Z_1$ have the uniform distribution on $(0,1)$ and let $Z_2$ have density $\frac 1 {2\sqrt x}$ on $(0,1)$. Take $f = 0$ and $f_n(x)=n^{1/4} I_{(0,\frac 1 n)}(x)$. Then
$$E|f_n(Z_1)|^2 = \int_0^{1/n} \sqrt{n}\,dx = n^{-1/2} \to 0,$$
whereas
$$E|f_n(Z_2)|^2 = \int_0^{1/n} \frac{\sqrt{n}}{2\sqrt{x}}\,dx = \sqrt{n}\cdot n^{-1/2} = 1.$$
So $f_n(Z_1) \to 0$ in $L^{2}$, but $f_n(Z_2)$ does not tend to $0$ in $L^{2}$.
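For what it's worth, here is a quick Monte Carlo sanity check of this counterexample (a hypothetical script, not part of the argument). It samples $Z_2$ by inverse-CDF: the CDF of the density $\frac{1}{2\sqrt{x}}$ is $\sqrt{x}$, so $Z_2 = U^2$ for $U$ uniform on $(0,1)$.

```python
import random

random.seed(0)
N_SAMPLES = 1_000_000
n = 10_000  # index of f_n; f_n(x) = n^{1/4} on (0, 1/n), else 0


def f_n_sq(x):
    # |f_n(x)|^2 = sqrt(n) * indicator(0 < x < 1/n)
    return n ** 0.5 if 0 < x < 1 / n else 0.0


# E|f_n(Z_1)|^2 with Z_1 ~ Uniform(0,1); exact value is n^{-1/2} = 0.01
est_p1 = sum(f_n_sq(random.random()) for _ in range(N_SAMPLES)) / N_SAMPLES

# E|f_n(Z_2)|^2 with Z_2 = U^2 (density 1/(2 sqrt(x))); exact value is 1
est_p2 = sum(f_n_sq(random.random() ** 2) for _ in range(N_SAMPLES)) / N_SAMPLES

print(est_p1)  # close to 0.01
print(est_p2)  # close to 1
```

The two estimates make the gap visible: the first shrinks like $n^{-1/2}$ as the index $n$ grows, while the second stays pinned near $1$.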