The Limit of the Expected Value of a Function of a Sequence of RVs


The question states:

Show $$\lim\limits_{n\to\infty}\mathbb{E}[g(X_n)] = g(c),$$ where $(X_n)$ is a sequence of random variables with $X_n \rightarrow c$ in probability, and $g$ is a continuous, bounded function that is also uniformly continuous on $[c-\delta, c+\delta]$. We have already shown that $g(X_n) \rightarrow g(c)$ in probability. The hint says to rewrite $\mathbb{E}[g(X_n)-g(c)]$ as $$\mathbb{E}[(g(X_n) - g(c))\mathbb{1}_{|g(X_n)-g(c)| \leq \epsilon}] + \mathbb{E}[(g(X_n) - g(c))\mathbb{1}_{|g(X_n)-g(c)| > \epsilon}]$$ for all $\epsilon>0$.
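For what it's worth, the standard way to use this decomposition (a sketch, where $M$ denotes any bound on $|g|$, which exists since $g$ is bounded) is the triangle-inequality estimate

$$\left|\mathbb{E}[g(X_n)-g(c)]\right| \leq \mathbb{E}\left[|g(X_n)-g(c)|\,\mathbb{1}_{|g(X_n)-g(c)| \leq \epsilon}\right] + \mathbb{E}\left[|g(X_n)-g(c)|\,\mathbb{1}_{|g(X_n)-g(c)| > \epsilon}\right] \leq \epsilon + 2M\,\mathbb{P}\left(|g(X_n)-g(c)| > \epsilon\right).$$

The second term vanishes as $n \to \infty$ by convergence in probability, so $\limsup_{n\to\infty}\left|\mathbb{E}[g(X_n)-g(c)]\right| \leq \epsilon$ for every $\epsilon > 0$.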

The way I thought about it: on the event $|g(X_n) - g(c)| > \epsilon$, we know by convergence in probability that $$\lim\limits_{n\to\infty}\mathbb{P}(|g(X_n) - g(c)| > \epsilon) = 0.$$

Also, on the other event we know that $|g(X_n) - g(c)| \leq \epsilon$, from which I argued, by the definition of a limit, that $g(X_n)$ converges to $g(c)$.

Thus the expectation must tend to $0$, and by linearity of expectation we get:

$$\mathbb{E}[g(X_n)-g(c)] = \mathbb{E}[g(X_n)] - \mathbb{E}[g(c)] = \mathbb{E}[g(X_n)] - g(c) \rightarrow 0,$$ since $g(c)$ is a constant.

Thus $$\lim\limits_{n\to\infty} \mathbb{E}[g(X_n)] = g(c).$$

But I don't think this is correct. If someone has any tips, they would be appreciated.
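As a sanity check on the statement itself (not a proof), here is a small simulation. The choices $g = \arctan$, $c = 2$, and $X_n = c + Z/\sqrt{n}$ with $Z$ standard normal are my own illustrative assumptions: $g$ is bounded and continuous, and $X_n \rightarrow c$ in probability since $\operatorname{Var}(X_n) = 1/n \rightarrow 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 2.0  # the constant that X_n converges to in probability

def g(x):
    # a bounded, continuous choice of g (assumption for illustration)
    return np.arctan(x)

def expected_g(n, samples=200_000):
    # X_n = c + Z/sqrt(n): Var(X_n) = 1/n -> 0, so X_n -> c in probability
    x_n = c + rng.standard_normal(samples) / np.sqrt(n)
    return g(x_n).mean()  # Monte Carlo estimate of E[g(X_n)]

for n in (1, 10, 100, 1000):
    print(f"n={n:5d}  E[g(X_n)] ~ {expected_g(n):.4f}  (g(c) = {g(c):.4f})")
```

The printed estimates approach $g(c) = \arctan(2)$ as $n$ grows, in line with the claim.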