Let $X_{i}\sim \operatorname{Bern}\left(\frac{1}{i}\right)$ be independent, and let $S_{n}=\sum_{i=1}^{n}X_{i}$. Show that
$$\frac{S_{n}}{\ln n}\xrightarrow{\text{a.s.}} 1\,.$$
I think I have to use somewhere that $\frac{H_{n}}{\ln n}\to 1$, where $H_{n}=\sum_{i=1}^{n}\frac{1}{i}$ is the $n$th harmonic number; the combination of $\sum_{i=1}^{n}\frac{1}{i}$ and $\ln n$ just screams Euler–Mascheroni constant to me.
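As a quick numerical sanity check of that limit (illustration only; `harmonic` is just a throwaway helper name):

```python
import math

# Sanity check: H_n - ln(n) tends to the Euler-Mascheroni constant (~0.5772),
# which forces H_n / ln(n) -> 1.
def harmonic(n):
    """Return the n-th harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

for n in (10, 1_000, 100_000):
    print(n, harmonic(n) - math.log(n), harmonic(n) / math.log(n))
```

The difference stabilizes near $0.5772$, while the ratio drifts toward $1$ only slowly, since the gap is roughly $\gamma/\ln n$.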
So I try to bring in expectation somewhere to make that $H_{n}$ appear.
$$P\left(\left|\frac{S_{n}}{\ln n}-1\right|\geq \epsilon\right)\leq \frac{E\left[\left|\frac{S_{n}}{\ln n}-1\right|\right]}{\epsilon}$$ by Markov's inequality, applied to the nonnegative random variable $\left|\frac{S_{n}}{\ln n}-1\right|$.
This gives me that $P\left(\left|\frac{S_{n}}{\ln n}-1\right|\geq \epsilon\right)\to 0$ as $n\to\infty$, i.e. convergence in probability.
But this does not give me almost sure convergence.
Also, I cannot apply the Strong Law of Large Numbers, since the random variables are not i.i.d. Can anyone tell me how I should proceed? I am out of ideas.
I think you can use the following result to answer this:

> Let $X_{1},X_{2},\dots$ be independent random variables with finite variances, and let $0<b_{n}\uparrow\infty$. If $\sum_{n=1}^{\infty}\frac{\operatorname{Var}(X_{n})}{b_{n}^{2}}<\infty$, then $$\frac{S_{n}-E(S_{n})}{b_{n}}\stackrel{\text{a.s.}}\longrightarrow 0\,.$$

The above can be shown by combining Kronecker's lemma with Kolmogorov's convergence criterion; see, e.g., *A Probability Path* by Sidney Resnick.
If you take $b_n= \ln n$, then
$$\sum_{n=2}^\infty \frac{\operatorname{Var}(X_n)}{b_n^2}=\sum_{n=2}^\infty \frac{n-1}{n^2 (\ln n)^2}<\sum_{n=2}^\infty \frac{1}{n (\ln n)^2}<\infty\,,$$
using $\operatorname{Var}(X_n)=\frac{1}{n}\left(1-\frac{1}{n}\right)=\frac{n-1}{n^2}$. (The $n=1$ term causes no trouble: $X_1\equiv 1$, so $\operatorname{Var}(X_1)=0$.) The last series converges by the Cauchy condensation test, or by comparison with $\int_2^\infty \frac{dx}{x(\ln x)^2}<\infty$.
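For what it's worth, here is a small numerical check of the dominating series (illustration only; `partial_sum` is a throwaway helper):

```python
import math

# Partial sums of sum_{n>=2} (n-1) / (n^2 (ln n)^2); they grow very slowly
# and stay bounded, consistent with convergence of the series.
def partial_sum(N):
    """Partial sum over n = 2, ..., N of (n - 1) / (n^2 * (ln n)^2)."""
    return sum((n - 1) / (n**2 * math.log(n) ** 2) for n in range(2, N + 1))

for N in (100, 10_000, 100_000):
    print(N, partial_sum(N))
```

The increments between successive partial sums shrink like $\frac{1}{\ln N}$, matching the tail estimate from the integral comparison.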
Therefore,
$$\frac{S_n-E(S_n)}{b_n}=\frac{S_n}{\ln n}-\frac{H_n}{\ln n} \stackrel{\text{a.s.}}\longrightarrow 0\,,$$ since $E(S_n)=\sum_{i=1}^n E(X_i)=\sum_{i=1}^n \frac{1}{i}=H_n$,
and $$\frac{H_n}{\ln n}\to 1$$
together imply
$$\frac{S_n}{\ln n}\stackrel{\text{a.s.}}\longrightarrow 1$$
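To see the conclusion in action, here is a quick Monte Carlo sketch (an illustration, not a proof; `simulate_ratio` is a hypothetical helper using the standard `random` module):

```python
import math
import random

# Simulate one path of S_n = X_1 + ... + X_n with independent X_i ~ Bern(1/i)
# and return S_n / ln(n).
def simulate_ratio(n, seed=0):
    rng = random.Random(seed)
    s = sum(1 for i in range(1, n + 1) if rng.random() < 1.0 / i)
    return s / math.log(n)

ratios = [simulate_ratio(100_000, seed=k) for k in range(20)]
print(sum(ratios) / len(ratios))  # close to H_n / ln(n), which is about 1.05 at n = 1e5
```

Note that the convergence is logarithmically slow: even at $n=10^5$ the ratio's mean is $\frac{H_n}{\ln n}\approx 1.05$, not yet $1$.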