Let $X_n$ be a sequence of Gaussian random variables defined on the same probability space.
The statement is that if $X_n$ converges to some random variable $X$ in $L^2$-sense, then $X$ is also Gaussian.
I think it is possible to do this with characteristic functions, but I am curious what the simplest way to see it is.
Indeed, this can be deduced using characteristic functions, and the assumption of convergence in $\mathbb L^2$ can be relaxed to convergence in distribution.
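For completeness, here is a sketch of the characteristic-function argument alluded to above, assuming (as justified in the last paragraph below) that $\mathbb E[X_n]=m_n\to m$ and $\operatorname{Var}(X_n)=\sigma_n^2\to\sigma^2$:

```latex
% Characteristic function of a Gaussian X_n with mean m_n and variance sigma_n^2:
\varphi_{X_n}(t) = \mathbb E\bigl[e^{itX_n}\bigr]
  = \exp\Bigl(i m_n t - \tfrac{1}{2}\sigma_n^2 t^2\Bigr).
% Convergence in distribution gives pointwise convergence of characteristic functions:
\varphi_X(t) = \lim_{n\to\infty}\varphi_{X_n}(t)
  = \exp\Bigl(i m t - \tfrac{1}{2}\sigma^2 t^2\Bigr),
% which is the characteristic function of N(m, sigma^2), so X is Gaussian
% (possibly degenerate, when sigma = 0).
```

The direct argument below avoids characteristic functions entirely, at the price of working with the distribution functions by hand.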
First, we assume that $\mathbb E[X_n]=0$ and $\mathbb E[X_n^2]=1$ for each $n$.
Then we have to show that $X$ is standard normal. To this aim, notice that for fixed $\varepsilon>0$ and $t\in\mathbb R$, the inequality $$\mathbb P\{X\leqslant t\}\leqslant \mathbb P\{X_n\leqslant t+\varepsilon \} +\mathbb P\{|X_n -X|\gt \varepsilon\} =\mathbb P\{\mathcal N\leqslant t+\varepsilon\}+ \mathbb P\{|X_n -X|\gt \varepsilon\}$$ holds, where $\mathcal N$ denotes a random variable whose distribution is standard normal.
By the $\mathbb L^2$ convergence and Chebyshev's inequality, the last term converges to $0$ as $n\to\infty$; letting $\varepsilon\downarrow 0$ and using the continuity of the standard normal distribution function, we obtain $$\mathbb P\{X\leqslant t\}\leqslant\mathbb P\{\mathcal N\leqslant t\}.$$ In a similar way, we deduce that $$\mathbb P \{X\geqslant t\}\leqslant\mathbb P\{\mathcal N\geqslant t\}.$$ Since $\mathbb P\{\mathcal N=t\}=0$ for each $t$, we derive $$\mathbb P\{X \leqslant t\}\leqslant \mathbb P\{\mathcal N\leqslant t\} =1-\mathbb P\{\mathcal N\geqslant t\} \leqslant 1-\mathbb P\{X\geqslant t\} \leqslant \mathbb P\{X\leqslant t\},$$ hence equality holds throughout and $X$ is standard normal.
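As a numerical sanity check of the standard-normal case (a sketch with an arbitrarily chosen example sequence, not part of the proof): take independent standard normals $Z_0, Z_1, Z_2, \dots$ and set $X_n=\sqrt{1-1/n}\,Z_0+\sqrt{1/n}\,Z_n$, so that each $X_n$ is exactly $\mathcal N(0,1)$ and $X_n\to Z_0$ in $\mathbb L^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
size = 200_000

# The L^2 limit X is Z0; each X_n = sqrt(1 - 1/n) Z0 + sqrt(1/n) Zn
# has variance (1 - 1/n) + 1/n = 1, so X_n ~ N(0, 1) exactly, and
# E[(X_n - Z0)^2] = (1 - sqrt(1 - 1/n))^2 + 1/n -> 0.
z0 = rng.standard_normal(size)

for n in [10, 100, 10_000]:
    zn = rng.standard_normal(size)
    xn = np.sqrt(1 - 1 / n) * z0 + np.sqrt(1 / n) * zn
    l2_dist = np.mean((xn - z0) ** 2)
    print(f"n={n:6d}  E[(X_n - X)^2] ~ {l2_dist:.5f}")

# The limit X = Z0 should match the standard normal CDF, e.g.
# Phi(-1) ~ 0.159, Phi(0) = 0.5, Phi(1) ~ 0.841.
for t in [-1.0, 0.0, 1.0]:
    print(f"P(X <= {t:+.1f}) ~ {np.mean(z0 <= t):.3f}")
```

The empirical $\mathbb L^2$ distance shrinks with $n$, and the empirical distribution function of the limit agrees with the standard normal one, as the argument above predicts.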
In general, let $\mathbb E[X_n]=m_n$ and $\operatorname{Var}(X_n)= \sigma_n^2$. Since convergence in $\mathbb L^2$ implies convergence in $\mathbb L^1$, the sequence $(m_n)_{ n\geqslant 1}$ is convergent (say to $m$), and by the convergence of the $\mathbb L^2$ norms, the sequence $(\sigma_n^2)_{ n\geqslant 1}$ converges to some $\sigma^2$. If $\sigma>0$, we apply the previous case to $X'_n:=(X_n-m_n)/\sigma_n$ and $X':=(X-m)/\sigma$ (discarding the finitely many $n$ with $\sigma_n=0$, if any); if $\sigma=0$, then $X=m$ almost surely, a degenerate Gaussian.
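The convergence of the moments used in this reduction (interpreting $\sigma_n^2$ as the variance of $X_n$, with $m:=\mathbb E[X]$) follows from standard norm inequalities:

```latex
% L^2 convergence implies L^1 convergence, hence convergence of the means:
|m_n - m| = \bigl|\mathbb E[X_n - X]\bigr|
  \leqslant \mathbb E|X_n - X|
  \leqslant \lVert X_n - X\rVert_2 \to 0.
% The reverse triangle inequality in L^2 gives convergence of second moments:
\bigl|\lVert X_n\rVert_2 - \lVert X\rVert_2\bigr| \leqslant \lVert X_n - X\rVert_2 \to 0,
% and therefore
\sigma_n^2 = \mathbb E[X_n^2] - m_n^2 \;\longrightarrow\; \mathbb E[X^2] - m^2 =: \sigma^2.
```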