Consider a random variable $X$ defined on the probability space $(\Omega, \mathcal{F}, P)$ such that $X:\Omega\rightarrow \mathbb{R}$.
Suppose that $X\sim N(\mu, \sigma^2)$.
Consider a random function $T(X):\mathbb{R}\rightarrow \mathbb{R}$.
Let $\{Y_n\}$ be a sequence of random variables, each defined on a probability space $(\Omega_n, \mathcal{F}_n, P_n)$, such that $Y_n:\Omega_n\rightarrow \mathbb{R}$ for all $n$.
Assume $Y_n=O_{P_n}(1)$ and $Y_n\rightarrow_d T(X)$ as $n\rightarrow \infty$ where the meaning of $O_{P_n}(\cdot)$ is described here.
Can I conclude that $\lim_{n\rightarrow \infty}E_{P_n}(Y_n)=E_P(T(X))$? If not, which additional assumptions would be sufficient?
This question arises from the proof of Theorem 15.1 in van der Vaart's *Asymptotic Statistics*, p. 216, where the author writes "Because $\phi_n$ are uniformly bounded, $E_h\phi_n\rightarrow E_hT$".
It does not really matter that the $Y_n$ are defined on different probability spaces: all your assumptions and statements concern only their distributions on $\mathbb{R}$. So you can freely assume that everything is defined on the same probability space.
The normal distribution of $X$ is irrelevant too, since $T(X)$ can have arbitrary distribution.
The assumption $Y_n = O_{P}(1)$ is redundant: it already follows from the weak convergence, since a weakly convergent sequence is uniformly tight (the easy direction of Prokhorov's theorem).
You want to deduce the convergence of expectations from weak convergence. This is not always possible. There are some extra sufficient conditions:
uniform integrability: $\sup_n E[|Y_n| \mathbf{1}_{|Y_n|>C}] \to 0$ as $C\to\infty$. This is the sharpest (most general) condition, but usually not very convenient to verify;
de la Vallée-Poussin condition: for some nondecreasing function $V:[0,+\infty)\to[0,+\infty)$ such that $V(x)/x\to +\infty$ as $x\to+\infty$, it holds that $\sup_n E[V(|Y_n|)]<\infty$; e.g. $\sup_n E[|Y_n|^{1+\varepsilon}]<\infty$ for some $\varepsilon>0$. In your quotation the situation is even simpler: the $\phi_n$ are uniformly bounded, and a uniformly bounded sequence is trivially uniformly integrable, so the expectations converge.
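To see why weak convergence alone is not enough, here is a quick numerical sketch of the standard counterexample $Y_n = n\,\mathbf{1}\{U \le 1/n\}$ with $U\sim\mathrm{Unif}(0,1)$ (my own illustration, not part of the original post): $Y_n \rightarrow_d 0$ and the sequence is tight, yet $E[Y_n]=1$ for every $n$, while the expectation of the limit is $0$.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_Y(n, size):
    """Draw `size` realizations of Y_n = n * 1{U <= 1/n}, U ~ Uniform(0,1)."""
    u = rng.uniform(size=size)
    return n * (u <= 1.0 / n)

for n in [10, 100, 1000]:
    y = sample_Y(n, 200_000)
    # P(Y_n = 0) = 1 - 1/n -> 1, so Y_n -> 0 in distribution,
    # yet E[Y_n] = n * (1/n) = 1 for every n: the mean does not converge to 0.
    print(n, y.mean(), (y == 0).mean())

# Uniform integrability fails: for any fixed C, once n > C we have
# E[|Y_n| 1{|Y_n| > C}] = n * (1/n) = 1, which does not vanish as C -> infinity.
# The de la Vallee-Poussin check with V(x) = x^2 also fails:
# E[Y_n^2] = n^2 * (1/n) = n, which is unbounded in n.
```

Under either of the two sufficient conditions above this behavior is excluded; here both fail, exactly matching the divergence between $E[Y_n]=1$ and $E[\lim Y_n]=0$.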