Commonly we use the Central Limit Theorem in the following form.
Theorem: Let $\{X_i\}_{i=1}^\infty$ be a sequence of i.i.d. random variables such that $E(X_i)=\mu$, $\operatorname{Var}(X_i)=\sigma^2<\infty$. Then,
$$\sqrt{n}\frac{\overline{X}_n-\mu}{\sigma}\stackrel{d}{\longrightarrow}\mathcal{N}(0,1), \tag{1}$$
where $\overline{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$.
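(As a quick numerical sanity check of $(1)$, not part of any proof, here is a small Monte Carlo sketch; the choice of Exponential(1) samples and the sample sizes are mine.)

```python
import numpy as np

# Monte Carlo illustration of the CLT in form (1):
# standardized sample means of Exponential(1) draws (mu = sigma = 1)
# should be approximately N(0, 1) for moderately large n.
rng = np.random.default_rng(0)
n, reps = 200, 20_000
mu, sigma = 1.0, 1.0  # Exponential(1) has mean 1 and variance 1

samples = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - mu) / sigma

print(z.mean(), z.std())  # both should be close to 0 and 1
```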
I need to show that expression $(1)$ can be replaced by expression $(2)$:
$$\overline{X}_n - \mu \overset{d}{\longrightarrow}\mathcal{N}(0,\frac{\sigma^2}{n}). \tag{2}$$
I tried to use
Slutsky's theorem: Let $\{A_n\}$ and $\{B_n\}$ be sequences of random variables. If $A_n \overset{d}{\longrightarrow} A$ and $B_n \overset{P}{\longrightarrow} c$ for a constant $c$, then $A_n \cdot B_n \overset{d}{\longrightarrow} cA\,$ (and $A_n + B_n \overset{d}{\longrightarrow} A + c$).
If we define $A_n = \sqrt{n}\frac{\overline{X}_n-\mu}{\sigma}$ and $B_n = \sigma$ for all $n$ (a constant sequence, so $B_n \overset{P}{\longrightarrow} \sigma$ trivially), then by Slutsky's theorem we have
$$A_n \cdot B_n = \sqrt{n} (\overline{X}_n-\mu) \overset{d}{\longrightarrow} \sigma \cdot \mathcal{N}(0,1) \equiv \mathcal{N}(0,\sigma^2). \tag{3}$$
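(Again just a numerical sketch of $(3)$, with my own choice of Uniform$(0,1)$ samples, for which $\mu = 1/2$ and $\sigma^2 = 1/12$: the empirical standard deviation of $\sqrt{n}(\overline{X}_n - \mu)$ should be close to $\sigma$.)

```python
import numpy as np

# Check of (3): for Uniform(0, 1) draws, sqrt(n) * (sample mean - mu)
# should be approximately N(0, sigma^2) with sigma^2 = 1/12.
rng = np.random.default_rng(1)
n, reps = 200, 20_000
mu, sigma = 0.5, (1.0 / 12.0) ** 0.5

samples = rng.uniform(size=(reps, n))
y = np.sqrt(n) * (samples.mean(axis=1) - mu)

print(y.std(), sigma)  # empirical std vs. theoretical sigma (about 0.289)
```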
Next, I wanted to show that $(3)$ implies $(2)$. I tried to use Slutsky theorem one more time for sequences $\widetilde{A_n} = \sqrt{n} (\overline{X}_n-\mu), \, \widetilde{B}_n = \frac{1}{\sqrt{n}}$: $$\widetilde{A}_n \cdot \widetilde{B}_n = \overline{X}_n-\mu \overset{d}{\longrightarrow} 0 \cdot \mathcal{N}(0, \sigma^2).$$
As you can see, a direct application of Slutsky's theorem gives zero as the limit, because $\widetilde{B}_n \overset{P}{\longrightarrow} 0$.
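(The degenerate limit is at least consistent with the exact variance: $\operatorname{Var}(\overline{X}_n - \mu) = \sigma^2/n \to 0$, which is what $(2)$ is informally describing. A small simulation of my own, again with Exponential(1) samples, shows this $\sigma^2/n$ scaling.)

```python
import numpy as np

# Sketch (not a proof): the variance of X-bar_n - mu is sigma^2 / n,
# so the unscaled difference collapses to 0 as n grows.
rng = np.random.default_rng(2)
reps = 5_000
mu, sigma2 = 1.0, 1.0  # Exponential(1)

for n in (100, 400, 1600):
    diffs = rng.exponential(size=(reps, n)).mean(axis=1) - mu
    print(n, diffs.var(), sigma2 / n)  # empirical vs. theoretical variance
```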
So how to show correctly that expression $(3)$ implies $(2)$ (or, equivalently, $(1)$ implies $(2)$)?
Maybe it is easier to prove this without using Slutsky's theorem?