Let $\{X_n\}_{n=1}^\infty$ be a sequence of random variables. Assume we have another sequence of random variables $\{Y_n\}_{n=1}^\infty$ such that $X_n-Y_n\longrightarrow 0$ in probability. Furthermore, we know that the moments of $Y_n$ converge to the moments of $N(0,1)$ (this implies that $Y_n$ converges in distribution to $N(0,1)$, since the normal distribution is determined by its moments). By Slutsky's theorem, $X_n=Y_n+(X_n-Y_n)$ converges in distribution to $N(0,1)$.
I was wondering whether I can say anything about the moments of $X_n$. That is, do the moments of $X_n$ converge to the moments of $N(0,1)$, or not necessarily?
If I knew my $X_n$ were subgaussian, then I know the answer would be yes, but I don't know whether my $X_n$ actually are subgaussian.
No. As you said, you need an additional integrability condition.
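For reference, one standard sufficient condition (stated here as a reminder, with $k$ and $\varepsilon$ generic): convergence in distribution together with a uniformly bounded higher moment gives convergence of the $k$-th moment,
$$ X_n \xrightarrow{d} X \quad\text{and}\quad \sup_n E|X_n|^{k+\varepsilon} < \infty \ \text{for some }\varepsilon>0 \quad\Longrightarrow\quad E[X_n^k] \to E[X^k], $$
since the bounded $(k+\varepsilon)$-th moments make $\{|X_n|^k\}$ uniformly integrable.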
In general, the moments of $X_n$ may not even exist. Here's an example.
Let $Y=Y_1=Y_2=\dots$ be a standard Gaussian random variable.
Let $$ X_n = \begin{cases} Y & |Y|<n \\ e^{Y^2} & |Y|\ge n\end{cases}$$
Now $P(|X_n - Y|>0) \le P(|Y|>n)\to0$, so we have convergence in probability.
In fact, $X_n \to Y$ almost surely.
However,
$$\begin{align*} E |X_n| &= E [|Y| ,|Y|<n] + E [ e^{Y^2},|Y|\ge n] \\ & \ge \frac{2}{\sqrt{2\pi}}\int_n^\infty e^{x^2} e^{-x^2/2} \,dx = \frac{2}{\sqrt{2\pi}}\int_n^\infty e^{x^2/2} \,dx \\ & = \infty. \end{align*} $$
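A quick Monte Carlo sketch illustrates both effects (the sample size, seed, and values of $n$ below are arbitrary choices, not part of the argument): the probability that $X_n$ differs from $Y$ shrinks with $n$, while the rare event $\{|Y|\ge n\}$ injects enormous values of $e^{Y^2}$ that dominate the sample mean of $|X_n|$.

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.standard_normal(100_000)  # i.i.d. draws of the standard Gaussian Y

def X(n, Y):
    # X_n = Y on {|Y| < n}, and e^{Y^2} on {|Y| >= n}
    return np.where(np.abs(Y) < n, Y, np.exp(Y**2))

# Empirical P(|X_n - Y| > 0) <= P(|Y| >= n): shrinks as n grows,
# matching convergence in probability (indeed a.s.).
probs = [float(np.mean(np.abs(X(n, Y) - Y) > 0)) for n in (1, 2, 3, 4)]

# Yet the sample mean of |X_n| is blown up by the few samples
# with |Y| >= n, each contributing e^{Y^2} (up to ~e^{16} here).
means = [float(np.abs(X(n, Y)).mean()) for n in (1, 2, 3)]
```

Of course, any finite sample mean is finite; the point of the simulation is only that the empirical mean of $|X_n|$ is orders of magnitude larger than $E|Y| \approx 0.8$ and driven entirely by the truncation tail.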