I have an independent sequence of random variables $(X_i)_{i\geq 1}$ such that $E[X_i] = \mu_i > 0$ and $E[X_i^2] < \infty$. I know that $\frac{1}{n}\sum_{i=1}^n\mu_i \to \mu < \infty$ as $n\to\infty$. Is it then true that
$$\frac{1}{n}\sum_{i=1}^nX_i \to \mu \quad \text{almost surely}$$
If it helps, the distribution (law) of $X_i$ is parametrized by a single parameter for each $i$ (e.g. an exponential distribution), and apart from this parameter all the $X_i$ have the same distributional form.
Yes, under a mild condition on the variances. Consider the centered variables $Y_n=X_n-\mu_n$. They are still independent, and each has mean zero. By Kolmogorov's strong law of large numbers for independent (not necessarily identically distributed) variables, $\frac{1}{n}\sum_{i=1}^nY_i\to 0$ almost surely whenever $\sum_{n\geq 1}\operatorname{Var}(X_n)/n^2<\infty$ (in particular, whenever the variances are bounded). This means that $$\frac{1}{n}\sum_{i=1}^nX_i-\frac{1}{n}\sum_{i=1}^n\mu_i\to 0$$ almost surely. Since $\frac{1}{n}\sum_{i=1}^n\mu_i\to\mu$ by assumption, the desired result follows.
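As a sanity check, here is a small NumPy simulation. The concrete choices are illustrative only: I take $X_i$ exponential with mean $\mu_i = 2 + 1/i$, so that $\frac{1}{n}\sum_{i=1}^n\mu_i\to\mu=2$ and the variances $\sigma_i^2=\mu_i^2$ are bounded.

```python
import numpy as np

# Sanity-check simulation: independent X_i ~ Exponential(mean mu_i) with
# mu_i = 2 + 1/i, so (1/n) * sum_{i<=n} mu_i -> mu = 2 as n -> infinity.
# (The exponential family and mu = 2 are illustrative choices, not from the question.)
rng = np.random.default_rng(0)
n = 200_000
i = np.arange(1, n + 1)
mu_i = 2.0 + 1.0 / i                # E[X_i] = mu_i
X = rng.exponential(scale=mu_i)     # numpy's `scale` parameter is the mean
running_avg = np.cumsum(X) / i      # (1/n) * sum_{i<=n} X_i for each n

print(running_avg[-1])              # close to mu = 2
```

The running average settles near $2$, as the almost-sure convergence predicts.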
In fact, you can prove a more general statement:
Proposition 1. Let $X_1,X_2,\ldots$ be a sequence of independent random variables and let $S_n=\tfrac{1}{n}\sum_{i=1}^nX_i$. Denote by $m_n=\mathbb{E}X_n$ the mean of $X_n$ and define $M_n=\tfrac{1}{n}\sum_{i=1}^nm_i$. If the sequence of variances $\sigma_n^2=\mathbb{E}(X_n-m_n)^2$ is bounded and $M_n\to m$ for some $m\in\mathbb{R}$, then $S_n\to m$ in probability. (In fact the convergence is almost sure: bounded variances imply $\sum_n\sigma_n^2/n^2<\infty$, so Kolmogorov's strong law applies. The Chebyshev argument below yields only the weaker conclusion, but does so directly.)
Proof. By independence, $\operatorname{Var}(S_n)=\frac{1}{n^2}\sum_{i=1}^n\sigma_i^2\leq\frac{C}{n}$, where $C=\sup_i\sigma_i^2<\infty$. Hence, by Chebyshev's inequality, for any $\epsilon>0$, $$\mathbb{P}\left(|S_n-M_n|\geq \epsilon\right)\leq\frac{\operatorname{Var}(S_n)}{\epsilon^2}\leq\frac{C}{n\epsilon^2}\to 0,$$ so $S_n-M_n\to 0$ in probability; combined with $M_n\to m$, this gives $S_n\to m$ in probability.
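The Chebyshev bound can also be checked numerically. The following sketch (same illustrative exponential setup as before: $\mu_i = 2 + 1/i$, hence $\sigma_i^2 = \mu_i^2 \leq 9$, so the variances are bounded) estimates $\mathbb{P}(|S_n - M_n|\geq\epsilon)$ by Monte Carlo and compares it with $C/(n\epsilon^2)$:

```python
import numpy as np

# Monte Carlo illustration of the Chebyshev bound from the proof:
# P(|S_n - M_n| >= eps) <= C / (n * eps^2), with C = sup_i sigma_i^2.
# Illustrative choice: X_i ~ Exponential(mean mu_i), mu_i = 2 + 1/i,
# so sigma_i^2 = mu_i^2 <= 9 (bounded variances).
rng = np.random.default_rng(1)
eps = 0.1
reps = 500                                        # independent replications per n
p_hats = []
for n in (100, 1_000, 10_000):
    i = np.arange(1, n + 1)
    mu_i = 2.0 + 1.0 / i
    X = rng.exponential(scale=mu_i, size=(reps, n))  # reps copies of (X_1, ..., X_n)
    S_n = X.mean(axis=1)
    M_n = mu_i.mean()
    p_hat = np.mean(np.abs(S_n - M_n) >= eps)     # empirical P(|S_n - M_n| >= eps)
    bound = (mu_i ** 2).max() / (n * eps ** 2)    # Chebyshev bound C / (n * eps^2)
    p_hats.append(p_hat)
    print(f"n={n:6d}  empirical={p_hat:.3f}  bound={min(bound, 1.0):.3f}")
```

Both the empirical probability and the bound decay as $n$ grows, and the empirical value always stays below the bound, as the proposition requires.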