Let $(X_n)_n$ be a sequence of Gaussian random vectors taking values in $\mathbb{R}^d,$ and let $K_{X_n}$ denote the covariance matrix of $X_n.$ Show that if $(X_n)_n$ converges in distribution to $X,$ then $(K_{X_n})_n$ and $(E[X_n])_n$ converge. Find the distribution of $X.$
In terms of characteristic functions we have $$\forall x \in \mathbb{R}^d, \quad\varphi_{X_n}(x)=e^{i \, ^tx\operatorname E[X_n]-\frac{1}{2}\,^txK_{X_n}x} \quad\text{and}\quad \lim_n\varphi_{X_n}(x)=\varphi_X(x).$$ How can we use this to prove the convergence of $(K_{X_n})_n$ and $(\operatorname E[X_n])_n$?
First observe that the vector $X$ is necessarily Gaussian: indeed, if $c_1,\dots,c_d$ are constants, the sequence of random variables $(Y_n)$ defined by $Y_n:=\sum_{i=1}^dc_iX_n^{(i)}$ converges in distribution to $\sum_{i=1}^dc_iX^{(i)}$, and a limit in distribution of Gaussian random variables is Gaussian (possibly degenerate). Moreover, we know that if $Y_n\sim N(\mu_n,\sigma_n^2)$ and $(Y_n)$ converges in distribution to $Y$, then there exist $\mu$ and $\sigma^2$ such that $\mu_n\to\mu$, $\sigma_n^2\to \sigma^2$, and $Y\sim N(\mu,\sigma^2)$.
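A sketch of the one-dimensional fact, arguing on the characteristic functions $\varphi_{Y_n}(t)=e^{i\mu_n t-\sigma_n^2t^2/2}$ (the notation $\mu,\sigma^2$ below is the limit whose existence is being shown):

```latex
% Taking moduli of the characteristic functions removes the mean term:
\[
  |\varphi_{Y_n}(t)| = e^{-\sigma_n^2 t^2/2} \longrightarrow |\varphi_Y(t)|
  \qquad \text{for every } t \in \mathbb{R}.
\]
% \varphi_Y is continuous with \varphi_Y(0)=1, so |\varphi_Y(t)|>0 for small t,
% and for such a fixed t we may take logarithms:
\[
  \sigma_n^2 \longrightarrow -\frac{2\log|\varphi_Y(t)|}{t^2} =: \sigma^2.
\]
% Consequently e^{i\mu_n t} = \varphi_{Y_n}(t)\,e^{\sigma_n^2 t^2/2} converges
% for every t; this forces (\mu_n) to converge to some \mu (tightness of (Y_n)
% keeps (\mu_n) bounded, and all subsequential limits must agree), whence
\[
  \varphi_Y(t) = e^{i\mu t - \sigma^2 t^2/2}, \qquad Y \sim N(\mu,\sigma^2).
\]
```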
It remains to show that $E[X_n]\to E[X]$ and $E[X_n^{(i)}X_n^{(j)}]\to E[X^{(i)}X^{(j)}]$ for all $i,j$. For the means, apply the one-dimensional result to each coordinate $\left(X_n^{(i)}\right)$, which converges in distribution to $X^{(i)}$. For the cross moments, use the convergences in distribution of $X_n^{(i)}-X_n^{(j)}$ to $X^{(i)}-X^{(j)}$, of $X_n^{(i)}+X_n^{(j)}$ to $X^{(i)}+X^{(j)}$, and of $X_n^{(i)}$, $X_n^{(j)}$ to $X^{(i)}$, $X^{(j)}$: each of these sequences is Gaussian, so the corresponding means and variances converge, and polarization recovers the cross moment. Consequently $E[X_n]\to\mu:=E[X]$ and $K_{X_n}\to K:=K_X$, and $X\sim N(\mu,K)$.
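The cross-moment step can be made explicit with the polarization identity (same notation as above):

```latex
% Polarization: the cross moment in terms of second moments of sum and difference.
\[
  E\!\left[X_n^{(i)}X_n^{(j)}\right]
  = \tfrac14\left( E\!\left[\bigl(X_n^{(i)}+X_n^{(j)}\bigr)^2\right]
                 - E\!\left[\bigl(X_n^{(i)}-X_n^{(j)}\bigr)^2\right] \right).
\]
% Each second moment splits into variance plus squared mean:
\[
  E\!\left[\bigl(X_n^{(i)}\pm X_n^{(j)}\bigr)^2\right]
  = \operatorname{Var}\!\bigl(X_n^{(i)}\pm X_n^{(j)}\bigr)
  + \bigl(E[X_n^{(i)}] \pm E[X_n^{(j)}]\bigr)^2,
\]
% and every term on the right converges, by the one-dimensional result applied
% to the Gaussian sequences X_n^{(i)}+X_n^{(j)} and X_n^{(i)}-X_n^{(j)}.
```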