What are the general results for when one is allowed to move the expectation and variance operators inside an infinite sum of random variables?
So, when can I say $\operatorname E \sum_{i = 1}^\infty X_i = \sum_{i = 1}^\infty \operatorname E X_i$, and likewise for the variance? What conditions must be satisfied?
For this question, assume the sequence $(X_i)$ is i.i.d., and that all infinite sums converge almost surely.
$$ \operatorname E\left( \sum_i X_i \right) = \sum_i \operatorname E X_i \text{ ?} \tag 1 $$
The above holds regardless of whether the random variables are i.i.d., but it does need a hypothesis: it is true whenever every $X_i \ge 0$ (by monotone convergence / Tonelli), or whenever $\sum_i \operatorname E |X_i| < \infty$ (by dominated convergence / Fubini). Note, though, that if all of the terms have equal expectation, as in your i.i.d. setting, then the sum of expectations diverges to $+\infty$ or $-\infty$ unless the expectation of each term is $0.$
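To see that some hypothesis really is needed, here is a standard counterexample (the construction, with $U$ uniform on $(0,1),$ is mine, not from the question). Let $S_n = n\,\mathbf 1_{\{U < 1/n\}}$ and let $X_n = S_n - S_{n-1}$ (with $S_0 = 0$), so the partial sums telescope to $S_n.$ Then
$$ \sum_{n=1}^\infty X_n = \lim_{n\to\infty} S_n = 0 \text{ almost surely}, \qquad \text{so } \operatorname E \sum_{n=1}^\infty X_n = 0, $$
but $\operatorname E X_1 = 1$ and $\operatorname E X_n = \operatorname E S_n - \operatorname E S_{n-1} = 1 - 1 = 0$ for $n \ge 2,$ so
$$ \sum_{n=1}^\infty \operatorname E X_n = 1 \ne 0. $$
Consistently with the conditions above, the terms are not nonnegative and $\sum_n \operatorname E|X_n| = \infty.$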
With variances it's more complicated. With finitely many terms in the sum, the variance of the sum equals the sum of the variances if the random variables are pairwise uncorrelated. That is a weaker assumption than pairwise independence, which in turn is weaker than full (mutual) independence. So it certainly holds if the variables are independent, and since you assumed independence, that's all you need. Maybe I'll add some more on this later$\,\ldots$
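For reference, the finite-sum claim is just the bilinearity of covariance:
$$ \operatorname{Var}\left( \sum_{i=1}^n X_i \right) = \sum_{i=1}^n \operatorname{Var}(X_i) + 2 \sum_{1 \le i < j \le n} \operatorname{Cov}(X_i, X_j), $$
and the cross terms vanish exactly when the variables are pairwise uncorrelated, leaving $\operatorname{Var}\left( \sum_i X_i \right) = \sum_i \operatorname{Var}(X_i).$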