Using the law of large numbers to get a dominating sequence


Let $X_1, X_2, \dots$ be an iid sequence of random variables such that $E X_1 = \mu$, $E \vert X_1 \vert^p < \infty$ for some $1 < p < 2$, and $E \log X_1 < 0$. The goal is to compute \begin{equation} E(S_\infty) = E \left( \sum_{n=1}^\infty \prod_{k=1}^n X_k \right). \end{equation} My idea was to consider \begin{align} \prod_{k=1}^n X_k &= \exp \left( \sum_{k=1}^n \log X_k \right) \\ &= \exp \left( \frac{1}{n} \sum_{k=1}^n \log X_k \right)^n \\ &= \exp \left( E \log X_1 + o \left( n^{1/p - 1} \right) \right)^n \\ &= \exp \left( n \, E \log X_1 + o \left( n^{1/p} \right) \right) \\ &\leq \exp(-n \epsilon) \end{align} for some $\epsilon > 0$ and all sufficiently large $n$. Here I used Theorem 2.5.12 from Durrett's book *Probability: Theory and Examples*. This gives a summable dominating sequence, so I could use dominated convergence to move the expectation inside the sum above and get \begin{equation} E(S_\infty) = \sum_{n=1}^\infty (E X_1)^n. \end{equation} Is this correct? The part with the dominating sequence feels weird, and so does the conclusion... I am fairly sure that this is false, but I can't find my mistake.
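As a numerical sanity check on the claimed identity (my own illustration, not part of the question), here is a minimal Monte Carlo sketch assuming the concrete choice $X_k \sim \mathrm{Uniform}(0,1)$, for which $E \log X_1 = -1 < 0$ and $\sum_{n \ge 1} (E X_1)^n = \sum_{n \ge 1} 2^{-n} = 1$:

```python
import random

def sample_S(n_terms=200, rng=random):
    """One (truncated) realization of S_infty = sum_n prod_{k<=n} X_k.

    With X_k ~ Uniform(0, 1) the partial products decay roughly like
    e^{-n} almost surely, so truncating at 200 terms is harmless.
    """
    total, prod = 0.0, 1.0
    for _ in range(n_terms):
        prod *= rng.random()   # X_k ~ Uniform(0, 1)
        total += prod          # add prod_{k=1}^n X_k to the partial sum
    return total

random.seed(0)
n_sims = 200_000
estimate = sum(sample_S() for _ in range(n_sims)) / n_sims
# Claimed value: sum_{n>=1} (E X_1)^n = sum_{n>=1} (1/2)^n = 1.
print(f"Monte Carlo estimate of E(S_infty): {estimate:.4f}")
```

The estimate should land close to $1$, which is at least consistent with the formula in this particular case.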

*(Screenshot of Durrett's Theorem 2.5.12 omitted.)*



**Accepted answer**

If the random variables are non-negative (which is suggested by the fact that you are taking logarithms), then you can always move the expectation under the sum thanks to Tonelli's theorem; in particular, you don't have to assume that the logarithm has negative expectation. In the case $\mathrm E[X_1] \ge 1$, the series is not integrable (and when the logarithm has positive expectation it even diverges almost surely), but then both sides of the equality are infinite.
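Spelled out, the Tonelli argument (together with independence and identical distribution) reads:

```latex
\begin{align*}
E(S_\infty)
  &= E\!\left( \sum_{n=1}^\infty \prod_{k=1}^n X_k \right)
   = \sum_{n=1}^\infty E\!\left( \prod_{k=1}^n X_k \right)
     && \text{(Tonelli, since } X_k \ge 0\text{)} \\
  &= \sum_{n=1}^\infty \prod_{k=1}^n E X_k
     && \text{(independence)} \\
  &= \sum_{n=1}^\infty (E X_1)^n
     && \text{(identical distribution)}
\end{align*}
```

The right-hand side is the geometric series, finite exactly when $E X_1 < 1$, in which case $E(S_\infty) = E X_1 / (1 - E X_1)$; otherwise both sides are $+\infty$.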