sequence of independent exponential random variables


I'm having trouble understanding a proof in J. R. Norris's Markov Chains.

Let $S_1, S_2, \ldots$ be a sequence of independent random variables with $S_n \sim E(\lambda_n)$ and $0<\lambda_n < \infty$ for all $n$. If $\sum_{n=1}^\infty \frac{1}{\lambda_n} = \infty$, then $P(\sum_{n=1}^\infty S_n = \infty) = 1$. The proof he gives goes as follows:

By monotone convergence and independence, $$\mathbb{E}\left(\exp\left(-\sum_{n=1}^\infty S_n\right)\right) = \prod_{n=1}^\infty \mathbb{E}(\exp(-S_n)) = \prod_{n=1}^\infty \left(1 + \frac{1}{\lambda_n}\right)^{-1} = 0,$$ so we have that $$P\left(\sum_{n=1}^\infty S_n = \infty\right) = 1.$$ First, I'm not sure how to apply the monotone convergence theorem correctly to get from the first step to the second, i.e. from the expectation of the negative exponentiated sum to the product of expectations. Secondly, I'm not sure why this product being $0$ implies the last equation, that the probability of the sum being infinite equals $1$.

Could somebody elucidate those two steps for me?

Best answer:

There is an exponent rule that says $\exp(a + b) = \exp(a) \cdot \exp(b)$. This gives us the first part of the first transformation, which helps to write out with $\dots$: (Strictly, each manipulation below should be carried out for the partial sum up to $N$; since $\exp(-\sum_{n=1}^N S_n)$ decreases to $\exp(-\sum_{n=1}^\infty S_n)$ as $N \to \infty$ and everything is bounded between $0$ and $1$, monotone convergence lets us pass the limit through the expectation. That is where the monotone convergence theorem enters.)

$$ E\left(\exp\left(-\sum_{n=1}^{\infty} S_n\right)\right) = E\left(\exp\left(-(S_1 + S_2 + S_3 + \dots)\right)\right) = E\left(\exp\left(-S_1 - S_2 - S_3 - \dots\right)\right) = E(\exp(-S_1) \cdot \exp(-S_2) \cdot \exp(-S_3) \cdot \dots) = E\left(\prod_{n=1}^{\infty} \exp(-S_n)\right) $$

One property of independence is that if $X$ and $Y$ are independent random variables, then $E(X \cdot Y) = E(X) \cdot E(Y)$. This gives us the second part of the first transformation: $$ E\left(\prod_{n=1}^{\infty} \exp(-S_n)\right) = \prod_{n=1}^{\infty} E(\exp(-S_n)) $$
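As a quick numerical sanity check of this factorization (an illustration only, not part of Norris's proof), here is a Monte Carlo estimate with two arbitrarily chosen rates $\lambda_1 = 2$ and $\lambda_2 = 3$, where the exact value of both sides is $\left(1+\tfrac{1}{2}\right)^{-1}\left(1+\tfrac{1}{3}\right)^{-1} = \tfrac{1}{2}$:

```python
import math
import random

# Monte Carlo check that E[exp(-S1) exp(-S2)] = E[exp(-S1)] E[exp(-S2)]
# for independent S1 ~ E(2) and S2 ~ E(3) (rates chosen arbitrarily).
random.seed(0)
N = 200_000
s1 = [random.expovariate(2.0) for _ in range(N)]  # E(lambda) has mean 1/lambda
s2 = [random.expovariate(3.0) for _ in range(N)]

lhs = sum(math.exp(-a) * math.exp(-b) for a, b in zip(s1, s2)) / N
rhs = (sum(math.exp(-a) for a in s1) / N) * (sum(math.exp(-b) for b in s2) / N)
exact = (1 + 1 / 2) ** -1 * (1 + 1 / 3) ** -1  # = (2/3)(3/4) = 1/2

print(lhs, rhs, exact)  # all three close to 0.5
```

The two estimates agree with each other and with the exact value up to Monte Carlo error, as the factorization predicts.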

The second transformation is just the expectation of $\exp(-S_n)$ computed against the $E(\lambda_n)$ density: $$ \mathbb{E}(\exp(-S_n)) = \int_0^\infty e^{-x} \, \lambda_n e^{-\lambda_n x} \, dx = \frac{\lambda_n}{1 + \lambda_n} = \left(1 + \frac{1}{\lambda_n}\right)^{-1}. $$ For the final equality to $0$, some intuition: for positive rates $\lambda_1, \lambda_2$ we have $$ \frac{1}{1 + \frac{1}{\lambda_1}} \cdot \frac{1}{1 + \frac{1}{\lambda_2}} = \frac{1}{\left(1 + \frac{1}{\lambda_1}\right)\left(1 + \frac{1}{\lambda_2}\right)} = \frac{1}{1 + \frac{1}{\lambda_1} + \frac{1}{\lambda_2} + \frac{1}{\lambda_1 \lambda_2}} $$

Notice that the two middle terms are $\frac{1}{\lambda_1} + \frac{1}{\lambda_2}$. If we also had $\lambda_3$, the middle terms would include $\frac{1}{\lambda_1} + \frac{1}{\lambda_2} + \frac{1}{\lambda_3}$. Continuing to $N$ factors, every cross term in the expanded denominator is positive, so the denominator is at least $1 + \sum_{n=1}^{N} \frac{1}{\lambda_n}$, which tends to $\infty$ by the assumption $\sum_{n=1}^{\infty} \frac{1}{\lambda_n} = \infty$. Hence the partial products, and with them the infinite product, tend to $0$.
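This can be illustrated numerically with a hypothetical choice of rates (not anything from Norris). For $\lambda_n = n$, the sum $\sum 1/\lambda_n$ is the divergent harmonic series and the partial products $\prod_{n=1}^{N}\left(1+\tfrac{1}{n}\right)^{-1} = \prod_{n=1}^{N}\tfrac{n}{n+1}$ telescope to exactly $\tfrac{1}{N+1} \to 0$; by contrast, for $\lambda_n = n^2$ the sum converges and the product stays bounded away from $0$:

```python
from math import prod

def partial_product(N, rate):
    """prod_{n=1}^N (1 + 1/lambda_n)^(-1), where rate(n) = lambda_n."""
    return prod(1 / (1 + 1 / rate(n)) for n in range(1, N + 1))

for N in (10, 100, 1000):
    p_div = partial_product(N, rate=lambda n: n)       # sum 1/n diverges
    p_conv = partial_product(N, rate=lambda n: n * n)  # sum 1/n^2 converges
    print(N, p_div, p_conv)
# p_div telescopes to exactly 1/(N+1) and tends to 0;
# p_conv decreases toward pi/sinh(pi), roughly 0.272, and stays positive.
```

The contrast shows why the hypothesis $\sum 1/\lambda_n = \infty$ is exactly what is needed for the product to vanish.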

The last part is implied because an exponential function can only be $0$ if the exponent is $-\infty$. If $\sum_{n=1}^{\infty} S_n < \infty$ then $\exp(-\sum_{n=1}^{\infty} S_n) > 0$. If $\sum_{n=1}^{\infty} S_n = \infty$ then $\exp(-\sum_{n=1}^{\infty} S_n) = 0$.

But, splitting the expectation according to whether the sum is finite,

$$ E\left(\exp\left(-\sum_{n=1}^{\infty} S_n\right)\right) = \exp\left(-\infty \right) P\left(\sum_{n=1}^{\infty} S_n = \infty \right) + \exp\left(-S \right) P\left(\sum_{n=1}^{\infty} S_n = S < \infty\right) = 0. $$

The first term is $0$ regardless of $P\left(\sum_{n=1}^{\infty} S_n = \infty \right)$ since $\exp(-\infty) = 0$. So we have $$ \exp\left(-S \right) P\left(\sum_{n=1}^{\infty} S_n = S < \infty\right) = 0. $$

The exponential factor is $> 0$. Thus $P\left(\sum_{n=1}^{\infty} S_n = S < \infty \right)$ must be $0$, since the whole expression equals $0$ and so cannot be a positive number. Equivalently, $P\left(\sum_{n=1}^{\infty} S_n = \infty\right) = 1$, which is the claimed conclusion.
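The heuristic split above can be made precise with indicator functions. Writing $T = \sum_{n=1}^{\infty} S_n$,

$$ 0 = E\left(e^{-T}\right) = E\left(e^{-T} \mathbf{1}_{\{T = \infty\}}\right) + E\left(e^{-T} \mathbf{1}_{\{T < \infty\}}\right) = 0 + E\left(e^{-T} \mathbf{1}_{\{T < \infty\}}\right). $$

A nonnegative random variable with zero expectation is zero almost surely, and $e^{-T} > 0$ on the event $\{T < \infty\}$, so $P(T < \infty) = 0$, i.e. $P\left(\sum_{n=1}^{\infty} S_n = \infty\right) = 1$.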