I was thinking that if the sequence of cumulative distribution functions $H_n$ converges to a distribution $H$, then $\epsilon_n$ should converge to $\epsilon$, which could be expressed as follows:
If $H_n$ is the normal distribution with mean $\epsilon_n$ and variance $\sigma_n^2$, then $H_n$ tends to the normal distribution $H$ with mean $\epsilon$ and variance $\sigma^2 > 0$ if and only if
$\epsilon_n \to \epsilon$ and $\sigma_n^2 \to \sigma^2$.
How could this claim be proved?
A sequence of random variables $H_n$ defined on $\mathbb{R}$ converges in distribution to $H$ (denoted $H_n \Rightarrow H$) if and only if the sequence of cumulative distribution functions (c.d.f.s) $F_n(x)$ converges pointwise to the c.d.f. $F$ of $H$ at every point of continuity of $F$.
If $\epsilon_n \rightarrow \epsilon$ and $\sigma_n \rightarrow \sigma$, then $$\lim_{n\rightarrow\infty}F_n(x) = \lim_{n\rightarrow\infty} \int_{-\infty}^x \frac{e^{-(y-\epsilon_n)^2 / 2\sigma_n^2}}{\sqrt{2 \pi\sigma_n^2}} dy = \int_{-\infty}^x \frac{e^{-(y-\epsilon)^2/2\sigma^2}}{\sqrt{2 \pi \sigma^2}} dy = F(x), $$ where the interchange of limit and integral is justified by dominated convergence: since $\epsilon_n \to \epsilon$ and $\sigma_n \to \sigma > 0$, for all large $n$ the integrands are dominated by an integrable Gaussian bound. (Alternatively, Scheffé's theorem applies, since the densities converge pointwise.) Thus $F_n(x) \rightarrow F(x)$ for all $x$, so $H_n \Rightarrow H$.
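As a quick numerical sanity check (not part of the proof), one can evaluate the Gaussian c.d.f. via the error function, $F(x) = \tfrac12\bigl(1 + \operatorname{erf}\bigl((x-\epsilon)/(\sigma\sqrt{2})\bigr)\bigr)$, and watch $F_n(x)$ approach $F(x)$; the sequences $\epsilon_n = \epsilon + 1/n$ and $\sigma_n = \sigma + 1/n$ below are just illustrative choices.

```python
from math import erf, sqrt

def gaussian_cdf(x, mu, sigma):
    """C.d.f. of N(mu, sigma^2), written with the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu, sigma = 1.0, 2.0   # limiting parameters (epsilon, sigma)
x = 0.5                # any fixed evaluation point

for n in (10, 100, 1000, 10000):
    mu_n, sigma_n = mu + 1.0 / n, sigma + 1.0 / n  # epsilon_n -> epsilon, sigma_n -> sigma
    gap = abs(gaussian_cdf(x, mu_n, sigma_n) - gaussian_cdf(x, mu, sigma))
    print(n, gap)  # the gap shrinks as n grows
```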
Conversely, if $H_n \Rightarrow H$, then since $F$ is continuous everywhere for a Gaussian we have $F_n(x) \rightarrow F(x)$ for all $x$. Moreover, if $F_{\epsilon, \sigma}$ is the c.d.f. of a Gaussian with mean $\epsilon$ and variance $\sigma^2$, then $F_{\epsilon_1,\sigma_1}(x) = F_{\epsilon_2,\sigma_2}(x)$ for all $x$ if and only if $\epsilon_1 = \epsilon_2$ and $\sigma_1 = \sigma_2$: the parameters are recoverable from the c.d.f. alone, since the median satisfies $F_{\epsilon,\sigma}^{-1}(1/2) = \epsilon$ and $F_{\epsilon,\sigma}^{-1}(\Phi(1)) = \epsilon + \sigma$, where $\Phi$ is the standard normal c.d.f. Because $F$ is continuous and strictly increasing, pointwise convergence $F_n \to F$ forces the corresponding quantiles to converge, so $\lim_{n\rightarrow\infty} \epsilon_n = \epsilon$ and $\lim_{n\rightarrow\infty} \sigma_n = \sigma$.
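The quantile argument in the converse can be sketched numerically: invert the c.d.f. by bisection and recover the mean from the median and the standard deviation from the $\Phi(1)$-quantile. The parameter values and the generic `quantile` helper below are hypothetical choices for illustration.

```python
from math import erf, sqrt

def gaussian_cdf(x, mu, sigma):
    """C.d.f. of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def quantile(p, cdf, lo=-100.0, hi=100.0, iters=200):
    """Invert a continuous, strictly increasing c.d.f. by bisection."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical parameters; the point is that F alone determines them.
mu, sigma = 1.5, 0.7
F = lambda x: gaussian_cdf(x, mu, sigma)

phi1 = gaussian_cdf(1.0, 0.0, 1.0)      # Phi(1)
mu_hat = quantile(0.5, F)               # the median recovers the mean
sigma_hat = quantile(phi1, F) - mu_hat  # F^{-1}(Phi(1)) - mean recovers sigma
print(mu_hat, sigma_hat)
```

Applying the same recovery to each $F_n$ yields exactly $\epsilon_n$ and $\sigma_n$, which is why pointwise convergence of the c.d.f.s pins down the limits of the parameters.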