Assume that $(X_{n, m})$ is a doubly indexed sequence of bounded random variables satisfying $$ \forall \varepsilon > 0 \quad \lim_{m\to \infty}\sup_{n\geq 1} \mathbb P(|X_{n, m}-Y_n| > \varepsilon) = 0, $$ where $(Y_n) \subset L^4$ is a sequence of random variables converging in distribution, $Y_n \overset D \to N(0, 1)$. The first condition is a form of convergence in probability $X_{n, m} \overset {\mathbb P} {\underset {m \to \infty} \longrightarrow} Y_n$, but uniform in $n$.
Can we prove that $$ \lim_{m\to\infty}\lim_{n \to \infty} \mathbb E X_{n, m}^4 = 3? $$ Here $3$ is the fourth moment of $N(0, 1)$.
Context: I'm trying to prove a converse of the fourth moment phenomenon for a certain type of random variables. I managed to prove that $$ \lim_{m \to \infty}\lim_{n\to \infty} \left( \mathbb E Y_n^4 - \mathbb E X_{n, m}^4 \right) = 0, $$ so the limit above would imply $\lim_{n\to\infty} \mathbb E Y_n^4 = 3$, which is what I want. The variables $X_{n, m}$ are certain truncations of $Y_n$, whose moment generating functions are finite on some neighborhood of $0$ (common to all $n$ for fixed $m$).
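To give a feel for the setup (this is only a simplified illustration, not my actual construction): for the hard truncation $$ \tilde X_{n, m} = Y_n \mathbf 1_{\{|Y_n| \le m\}}, $$ one has $\{|\tilde X_{n, m} - Y_n| > \varepsilon\} \subseteq \{|Y_n| > m\}$, so Markov's inequality gives $$ \sup_{n \ge 1} \mathbb P(|\tilde X_{n, m} - Y_n| > \varepsilon) \le \sup_{n \ge 1} \frac{\mathbb E Y_n^4}{m^4}, $$ which tends to $0$ as $m \to \infty$ if one additionally assumed $\sup_n \mathbb E Y_n^4 < \infty$; moreover each $\tilde X_{n, m}$ is bounded by $m$, so its moment generating function is finite everywhere.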
As far as I know, uniform integrability is needed for such tasks. But I have never come across this kind of uniform convergence in probability, and it gave me hope that it might be enough to prove something like "$X_{\infty, m}$ are bounded and converge (as $m \to \infty$) in distribution to $N(0, 1)$", which could perhaps be helpful. But even this is problematic, as it seems $\lim_{n\to \infty} X_{n, m}$ need not exist in any sense. Maybe one could find a subsequence (in $n$) that converges for all $m$ simultaneously (that would also be interesting), but I am unable to.
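For reference, the standard fact I have in mind when saying uniform integrability is needed: if $Z_k \overset D \to Z$ and the family $(Z_k^4)_{k \ge 1}$ is uniformly integrable, then $$ \mathbb E Z_k^4 \to \mathbb E Z^4 $$ (e.g. via the Skorokhod representation theorem and Vitali's convergence theorem). So a positive answer would follow if one could extract from the uniform-in-$n$ convergence in probability either uniform integrability of the fourth powers, or a limiting object to which this fact applies; it is exactly this step that I do not see.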