Suppose we have a decreasing sequence $\{\alpha_i\}_{i=1}^{\infty}$ where $\alpha_i \in \mathbb{R}^{+}$. That is
$$\mathbb{R}^{+} \ni \alpha_1 \geq \alpha_2 \geq \cdots > 0.$$
Further, let $\{X_i\}_{i=1}^{\infty}$ be a sequence of standard normal random variables, $X_i \sim N(0,1)$, and define $Y_i := \alpha_iX_i \sim N(0, \alpha_i^2)$. What conditions on the sequence $\{\alpha_i\}_{i=1}^{\infty}$ guarantee that $Y_i \stackrel{a.s.}{\to} 0$?
What I have:
1. For each $i$ we want to show that for every $\epsilon_i > 0$ there exists some constant $c_i$ such that
$$\mathbb{P}(|\alpha_iX_i| > c_i) = \mathbb{P}(|Y_i| > c_i) \leq \epsilon_i. \tag{$\star$}$$
So I need a systematic way of choosing the $\epsilon_i$ and $c_i$ such that for any $\delta > 0$ we have $\mathbb{P}(|Y_i| > \delta) \to 0$. I've been trying to incorporate the variances $\alpha_i^2$ into the choice of $c_i$, which seems to be the key point: the larger the variance, the flatter the pdf of $Y_i$, and hence the larger $c_i$ must be to satisfy $(\star)$. I don't see how to do this easily.
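One way to make the dependence of $c_i$ on the variance explicit (a sketch, using the standard Gaussian tail bound $\mathbb{P}(|Z| > t) \leq 2e^{-t^2/2}$ for $Z \sim N(0,1)$, which is not part of the problem statement):
$$\mathbb{P}(|Y_i| > c_i) = \mathbb{P}\!\left(|X_i| > \frac{c_i}{\alpha_i}\right) \leq 2\exp\!\left(-\frac{c_i^2}{2\alpha_i^2}\right),$$
so choosing $c_i = \alpha_i\sqrt{2\log(2/\epsilon_i)}$ makes the right-hand side exactly $\epsilon_i$, satisfying $(\star)$. As conjectured, $c_i$ scales linearly with $\alpha_i$.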
2. Once we have (1), we can apply the Borel–Cantelli lemma: if for every fixed $\epsilon > 0$
$$\sum_{i \geq 1} \mathbb{P}(|Y_i| > \epsilon) < \infty,$$
then $\mathbb{P}(|Y_i| > \epsilon \text{ i.o.}) = 0$, and intersecting these events over rational $\epsilon$ gives $Y_i \stackrel{a.s.}{\to} 0$,
which in turn answers the question of how to choose sufficient $\{\alpha_i\}_{i=1}^{\infty}$.
By Chebyshev's inequality,
$$\mathbb{P}(|\alpha_iX_i| > \epsilon) = \mathbb{P}\!\left(|X_i| > \frac{\epsilon}{\alpha_i}\right) \leq \frac{\mathbb{E}X_i^2}{(\epsilon/\alpha_i)^2} = \frac{\alpha_i^2}{\epsilon^2}.$$
So $\sum_i \alpha_i^2 < \infty$ is a sufficient condition, by the Borel–Cantelli lemma.
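As a quick numerical sanity check (a sketch; the choice $\alpha_i = 1/i$, which satisfies $\sum_i \alpha_i^2 = \pi^2/6 < \infty$, is my own illustration, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical choice: alpha_i = 1/i, so sum(alpha_i^2) = pi^2/6 < infinity.
n = 10_000
alpha = 1.0 / np.arange(1, n + 1)
X = rng.standard_normal(n)
Y = alpha * X  # one sample path of the sequence Y_i

# Chebyshev bound: P(|Y_i| > eps) <= alpha_i^2 / eps^2, and these bounds sum.
eps = 0.1
bound = alpha**2 / eps**2
assert np.isfinite(bound.sum())

# Along this path, the tail supremum sup_{i >= k} |Y_i| should shrink toward 0,
# which is what a.s. convergence looks like pathwise.
tail_sup = np.flip(np.maximum.accumulate(np.flip(np.abs(Y))))
print(tail_sup[0], tail_sup[5000], tail_sup[-1])
```

This only illustrates one sample path, of course; it is not a substitute for the Borel–Cantelli argument above.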