Let $X_n\sim N(0,1/n)$. Is there a continuous function $f$ such that
- $E[f^2(X_n)]<\infty$
- $\lim_{n\rightarrow \infty} E[f^2(X_n)] \neq f^2(0)$?
Also, what would happen if I add the condition
- $E[|f(X)f(Y)|]<\infty$ for all jointly normal $X,Y$ such that $EX=EY=0$
I know that there is no such $f$ if we additionally require $f$ to be bounded, since $X_n \stackrel{d}{\rightarrow}\delta_0$.
However, I am completely stuck on proving (or disproving) the existence of such an $f$ once we drop the boundedness condition.
I appreciate every hint!
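For what it's worth, a quick numerical experiment (with the illustrative unbounded choice $f(x) = e^x$, for which the normal MGF gives the exact value $E[f^2(X_n)] = E[e^{2X_n}] = e^{2/n}$) suggests that the expectation does converge to $f^2(0) = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.exp(x)  # illustrative unbounded continuous f

# X_n ~ N(0, 1/n); by the normal MGF, E[f(X_n)^2] = E[exp(2 X_n)] = exp(2/n)
for n in (1, 10, 100, 1000):
    x = rng.normal(0.0, 1.0 / np.sqrt(n), size=10**6)
    mc = np.mean(f(x) ** 2)
    print(f"n={n:5d}  Monte Carlo={mc:.4f}  exact={np.exp(2.0 / n):.4f}")
# the Monte Carlo estimates approach f(0)^2 = 1 as n grows
```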
No, there is no such function $f$.

Writing $g := f^2$, the question asks whether a continuous $g \geq 0$ with $\mathbb{E}g(X_n) < \infty$ can fail to satisfy $\mathbb{E}g(X_n) \to g(0)$. The following two statements show that it cannot.

Lemma: Let $g \colon \mathbb{R} \to [0,\infty)$ be measurable with $\mathbb{E}g(X_1) < \infty$. Then $$\lim_{R \to \infty} \sup_{n \in \mathbb{N}} \mathbb{E}\big(g(X_n) 1_{\{|X_n| \geq R\}}\big) = 0.$$

Theorem: Let $g \colon \mathbb{R} \to [0,\infty)$ be continuous with $\mathbb{E}g(X_1) < \infty$. Then $\mathbb{E}g(X_n) \to g(0)$ as $n \to \infty$.
Proof of the lemma: Set $R_n := \sqrt{\log(n)/(n-1)}$ for $n \geq 2$. A straightforward computation shows that $$\exp \left(- \frac{y^2}{2} (n-1) \right) \leq \frac{1}{\sqrt{n}} \quad \text{for all $|y| \geq R_n$}.$$ (Equality holds for $|y|=R_n$, and the monotonicity of the left-hand side in $|y|$ then gives the inequality for all $|y| \geq R_n$.) Hence, $$\exp \left(-n \frac{y^2}{2} \right) \leq \frac{1}{\sqrt{n}} \exp \left(-\frac{y^2}{2} \right) \quad \text{for all $|y| \geq R_n$.}$$

Since $R_n \to 0$ as $n \to \infty$, the supremum $R_0 := \sup_{n \geq 2} R_n$ is finite, and $$\exp \left(-n \frac{y^2}{2} \right) \leq \frac{1}{\sqrt{n}} \exp \left(-\frac{y^2}{2} \right) \quad \text{for all $|y| \geq R_0$, $n \in \mathbb{N}$.} \tag{1}$$ (Note that $(1)$ is trivially satisfied for $n=1$.) Since $g \geq 0$, it follows that \begin{align*} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) &= \sqrt{\frac{n}{2\pi}} \int_{|y| \geq R} g(y) \exp \left(-n \frac{y^2}{2} \right) \,d y \\ &\leq \frac{1}{\sqrt{2\pi}} \int_{|y| \geq R} g(y) \exp \left(- \frac{y^2}{2} \right) \, dy \\ &= \mathbb{E}(g(X_1) 1_{\{|X_1| \geq R\}}) \tag{2} \end{align*} for all $R \geq R_0$.

By assumption, $\mathbb{E}g(X_1)<\infty$, so the monotone convergence theorem (applied to $g(X_1) 1_{\{|X_1| < R\}} \uparrow g(X_1)$) yields $$\sup_{n \in \mathbb{N}} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) \stackrel{(2)}{\leq} \mathbb{E}(g(X_1) 1_{\{|X_1| \geq R\}}) \xrightarrow[]{R \to \infty} 0. \quad \Box$$
Proof of the theorem: Fix $\epsilon>0$. According to the above lemma, we can choose $R>0$ such that
$$\sup_{n \in \mathbb{N}} \mathbb{E}(g(X_n) 1_{\{|X_n| \geq R\}}) \leq \epsilon.$$
Without loss of generality, we may enlarge $R$ so that in addition
$$\sup_{n \in \mathbb{N}} \mathbb{P}(|X_n| \geq R) \leq \frac{\epsilon}{1+g(0)}.$$ By the triangle inequality, this implies that
$$\begin{align*} \mathbb{E}(|g(X_n)-g(0)|) &\leq \mathbb{E}(|g(X_n)-g(0)| 1_{\{|X_n| \leq R\}}) + \mathbb{E}(g(X_n) 1_{\{|X_n| > R\}}) + g(0) \, \mathbb{P}(|X_n| > R) \\ &\leq \mathbb{E}(|g(X_n)-g(0)| 1_{\{|X_n| \leq R\}}) + 2 \epsilon. \end{align*}$$
The random variable $X_n \sim N(0,1/n)$ has the same distribution as $U/\sqrt{n}$, where $U \sim N(0,1)$. Hence,
$$\mathbb{E}(|g(X_n)-g(0)|) \leq \mathbb{E} \left[ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{|U|/\sqrt{n} \leq R } \right]+ 2 \epsilon \tag{3}$$
Noting that
$$ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{|U|/\sqrt{n} \leq R } \leq 2 \sup_{|y| \leq R} |g(y)| < \infty$$
and
$$ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{|U|/\sqrt{n} \leq R } \xrightarrow[]{n \to \infty} 0$$
by the continuity of $g$, we conclude from the dominated convergence theorem that
$$\lim_{n \to \infty} \mathbb{E} \left[ \left| g \left( \frac{U}{\sqrt{n}} \right) - g(0) \right| 1_{|U|/\sqrt{n} \leq R } \right] = 0,$$
and so $(3)$ gives
$$\lim_{n \to \infty} \mathbb{E}(|g(X_n)-g(0)|) =0.$$
This implies, in particular, that
$$|\mathbb{E}g(X_n)-g(0)| \leq \mathbb{E}(|g(X_n)-g(0)|) \to 0,$$
i.e. $\mathbb{E}g(X_n) \to g(0)$. Applying this with $g = f^2$, which is continuous and nonnegative with $\mathbb{E}f^2(X_1) < \infty$ by assumption, gives $\mathbb{E}f^2(X_n) \to f^2(0)$, so no such $f$ exists.