So I am pretty sure that in the one-dimensional case we would just say $x \overset{d}{\to} N(0,\sigma^2)$ and $s^2 \overset{p}{\to} \sigma^2$, so writing $\frac{x}{s} = \frac{\sigma}{s} \cdot \frac{x}{\sigma}$, the first factor converges in probability to 1 and the second converges in distribution to $N(0,1)$, hence $\frac{x}{s} \overset{d}{\to} N(0,1)$ by the Cramér–Slutsky theorem. Trying to do this with a multivariate distribution, I used the Cholesky decomposition to say the variance matrix has a "square root" $\Sigma^{1/2}$ with $\Sigma = \Sigma^{1/2} (\Sigma^{1/2})^T$ (which exists because a variance matrix is by definition positive semidefinite; I assume it is actually positive definite so that the inverse exists) and applied the same principle: from $X \overset{d}{\to} N(0,\Sigma)$ and $S \overset{p}{\to} \Sigma$, we get $\Sigma^{-1/2} X \overset{d}{\to} N(0,I)$, and I would like to conclude $S^{-1/2} X = S^{-1/2} \Sigma^{1/2} \, \Sigma^{-1/2} X \overset{d}{\to} N(0,I)$.
My issue, however, is that I don't actually know whether $S^{-1/2} \Sigma^{1/2} \overset{p}{\to} I$ necessarily holds. So is this true? If not, how else could I show that the asymptotic distribution still holds when we plug in the consistent variance estimator?
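For what it's worth, here is a quick numerical sanity check (not a proof) of the claim I am trying to use. It simulates a hypothetical setup: draw iid samples from $N(0,\Sigma)$ for a small $\Sigma$ I made up, form the sample covariance $S$, take Cholesky factors of both, and watch $\lVert S^{-1/2}\Sigma^{1/2} - I \rVert$ as $n$ grows:

```python
import numpy as np

# Sanity check: does S^{-1/2} Sigma^{1/2} -> I as n grows?
# (Using lower-triangular Cholesky factors as the "square roots".)
rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])          # an arbitrary positive definite example
L = np.linalg.cholesky(Sigma)           # Sigma = L L^T, so L plays Sigma^{1/2}

for n in (100, 10_000, 1_000_000):
    data = rng.multivariate_normal(np.zeros(2), Sigma, size=n)
    S = np.cov(data, rowvar=False)      # consistent estimator of Sigma
    L_S = np.linalg.cholesky(S)         # S = L_S L_S^T, so L_S plays S^{1/2}
    M = np.linalg.solve(L_S, L)         # M = S^{-1/2} Sigma^{1/2}
    print(n, np.linalg.norm(M - np.eye(2)))
```

The printed deviation shrinks toward 0 as $n$ increases, which is consistent with $S^{-1/2}\Sigma^{1/2} \overset{p}{\to} I$ for this particular (Cholesky) choice of square root, though of course a simulation settles nothing in general.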