I have the following sequence of independent r.v.'s: $$\mathbb{P}(\{X_n=n\})=\frac{1}{n^2}, \qquad \mathbb{P}(\{X_n=0\})=1-\frac{1}{n^2},$$ and $S_n = \sum_{i=1}^{n} X_i$. I denote $H_n = \sum_{i=1}^n \frac{1}{i}$ and $K_n = \sum_{i=1}^n \frac{1}{i^2}.$
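Since $E(X_i)=i\cdot\frac{1}{i^2}=\frac{1}{i}$ and $Var(X_i)=i^2\cdot\frac{1}{i^2}-\frac{1}{i^2}=1-\frac{1}{i^2}$, independence gives $E(S_n)=H_n$ and $Var(S_n)=n-K_n$. A quick exact (not Monte Carlo) sanity check of these identities, with helper names of my own choosing:

```python
# Moment check for P(X_i = i) = 1/i^2, P(X_i = 0) = 1 - 1/i^2:
# E(X_i) = 1/i and Var(X_i) = 1 - 1/i^2, hence
# E(S_n) = H_n, Var(S_n) = n - K_n, and E(S_n^2) = n - K_n + H_n^2.

def harmonic(n):
    """H_n = sum_{i=1}^n 1/i."""
    return sum(1.0 / i for i in range(1, n + 1))

def harmonic2(n):
    """K_n = sum_{i=1}^n 1/i^2."""
    return sum(1.0 / i**2 for i in range(1, n + 1))

def moments(n):
    """Return (E[S_n], Var(S_n)), accumulated term by term."""
    mean = sum(i * (1.0 / i**2) for i in range(1, n + 1))
    var = sum(i**2 * (1.0 / i**2) - (1.0 / i) ** 2 for i in range(1, n + 1))
    return mean, var

n = 1000
mean, var = moments(n)
assert abs(mean - harmonic(n)) < 1e-9          # E(S_n) = H_n
assert abs(var - (n - harmonic2(n))) < 1e-9    # Var(S_n) = n - K_n
```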
I am trying to determine whether the sequence $\frac{S_n}{n}$ converges in probability to a constant $c_1$, and whether $\frac{S_n}{nH_n}$ converges almost surely to a constant $c_2$. Since the random variables are independent, their covariances are $0$. Writing $\mu =\lim_{n\to\infty} \frac{E(S_n)}{n}$, I expect $\frac{S_n}{n} \to \mu$ both in probability and almost surely. Since $E(S_n) = H_n$ and $\lim_{n\to\infty} \frac{H_n}{\log n} = 1$, I get $\mu = \lim_{n\to\infty}\frac{H_n}{n} = 0$.
So my guess is that $c_1=0$, and here is my proof: \begin{align} P(|S_n|>n\epsilon) \leq \frac{E(S_n^2)}{n^2\epsilon^2} &= \frac{Var(S_n)+E(S_n)^2}{n^2\epsilon^2} \\ &= \frac{n-K_n+H^2_n}{n^2\epsilon^2} \to_{n\to\infty} 0. \end{align}
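Purely as a numerical illustration of this convergence (not part of the proof), a Monte Carlo sketch with function names of my own:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_Sn_over_n(n, trials):
    """Simulate S_n / n, where X_i = i with prob. 1/i^2 and 0 otherwise."""
    i = np.arange(1, n + 1)
    p = 1.0 / i**2                      # P(X_i = i); note p[0] = 1, so X_1 = 1
    out = np.empty(trials)
    for t in range(trials):
        hits = rng.random(n) < p        # indices where X_i = i on this path
        out[t] = i[hits].sum() / n      # S_n / n
    return out

ratios = sample_Sn_over_n(100_000, 200)
# The typical value of S_n stays O(1), so S_n / n is tiny for n = 10^5.
print(float(np.median(ratios)))
```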
Now, since limits in probability are unique up to almost-sure equality, I conclude that $c_1=0$ (although the problem asks for a constant, so I am not sure if stating it this way is correct). For the second part, I also think that $c_2=0$, but I do not know how to justify it theoretically, and I showed that: \begin{align} P(|S_n|>nH_n\epsilon) \leq \frac{E(S_n^2)}{n^2H_n^2\epsilon^2} &= \frac{Var(S_n)+E(S_n)^2}{n^2H_n^2\epsilon^2} \\ &= \frac{n-K_n+H^2_n}{n^2H_n^2\epsilon^2}. \end{align} Hence $$\sum_{n\geq 1}P(|S_n|>nH_n\epsilon) \leq \sum_{n\geq 1}\Big( \frac{1}{nH_n^2\epsilon^2} - \frac{K_n}{n^2H_n^2\epsilon^2} + \frac{1}{n^2\epsilon^2}\Big).$$ The second and third terms are $O(1/n^2)$, so their sums are finite, and the first term is bounded for $n\geq 2$ by $\frac{1}{\epsilon^2}\cdot\frac{1}{n\log^2 n}$, which is summable; so by the Borel–Cantelli lemma $c_2=0$. Is this argument correct? Also, can I justify $c_2=0$ this way? \begin{align} P(|S_n-n c_2 H_n|>n H_n\epsilon) &\leq \frac{E(S_n^2)-2n c_2 H_n E(S_n)+n^2c_2^2H_n^2}{n^2H^2_n\epsilon^2}.\end{align} Here $\sum_{n\geq 1}\frac{E(S_n^2)}{n^2H^2_n\epsilon^2}<\infty$, but since $E(S_n)=H_n$ the other two terms are $-\frac{2c_2}{n\epsilon^2}$ and $\frac{c_2^2}{\epsilon^2}$, and the only way for the sum to be finite is if $c_2=0$ (or if $c_2$ decayed at least as fast as $1/n$, and hence were not a constant)?
You are making things too complicated. Note that $\sum P\{X_n \neq 0\} =\sum \frac 1 {n^{2}} < \infty $. By the Borel–Cantelli lemma, $X_n=0$ for all $n$ sufficiently large, with probability $1$. Hence $X_n \to 0$ almost surely; in fact $S_n$ is almost surely eventually constant, so $\frac {S_n} n$ and $\frac {S_n} {nH_n}$ both tend to $0$ almost surely (and therefore also in probability). Independence of the $X_n$'s is not even required for this!
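The Borel–Cantelli point can be seen numerically as well (an illustrative simulation, not part of the argument): on each simulated path only finitely many $X_i$ are nonzero, and the expected number of nonzero terms among $X_1,\dots,X_N$ is exactly $K_N \to \pi^2/6 \approx 1.645$.

```python
import numpy as np

rng = np.random.default_rng(1)
N, trials = 1_000_000, 50
i = np.arange(1, N + 1)
p = 1.0 / i**2                        # P(X_i = i); p[0] = 1, so X_1 = 1 always

counts = []
for _ in range(trials):
    hits = rng.random(N) < p          # which X_i are nonzero on this path
    counts.append(int(hits.sum()))    # number of nonzero terms among X_1..X_N

# Average number of nonzero terms per path; E[count] = K_N -> pi^2/6.
print(sum(counts) / trials)
```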