Resnick - Probability Path - Exercise 7.6

I'm trying to solve this exercise:

Suppose $\{X_k, k \ge 1\}$ are independent random variables and suppose $X_k$ has a gamma density $f_k(x)$,

$f_k(x)=\frac{x^{\gamma_k-1}\mathrm{e}^{-x}}{\Gamma(\gamma_k)}, \quad x>0,\ \gamma_k>0.$

Give necessary and sufficient conditions for $\sum\nolimits_{k = 1}^\infty {{X_k}}$ to converge almost surely. (Compare with the treatment of sums of exponentially distributed random variables.)

I know $X_k \sim \mathrm{Gamma}(\gamma_k,1)$, so $E(X_k)=V(X_k)=\gamma_k$.
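This fact can be checked numerically; the following is a quick sanity check, assuming `scipy` is available (the shape values are arbitrary examples):

```python
from scipy.stats import gamma

# For X_k ~ Gamma(gamma_k, 1) (shape gamma_k, scale 1), the mean and
# the variance both equal the shape parameter gamma_k.
for g in [0.1, 0.5, 1.0, 2.5]:
    dist = gamma(a=g)                  # scale defaults to 1
    print(g, dist.mean(), dist.var())  # both equal g
```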

So, by the Kolmogorov Convergence Criterion: if $\sum\nolimits_{k = 1}^\infty {{V(X_k)}}< \infty $, then $\sum\nolimits_{k = 1}^\infty {[{X_k}-E(X_k)]}$ converges almost surely.

Suppose $\sum\nolimits_{k = 1}^\infty {\gamma_k}<\infty $. Then the criterion applies, so there is a random variable $M$ with $\sum\nolimits_{k = 1}^\infty {[{X_k}-\gamma_k]} \to M$ almost surely, and hence
$\sum\nolimits_{k = 1}^\infty {X_k} = M+\sum\nolimits_{k = 1}^\infty {\gamma_k}<\infty $ a.s.

Therefore, can I conclude that $\sum\nolimits_{k = 1}^\infty {\gamma_k}<\infty$ is both necessary and sufficient?

Or do I still need to prove the converse:

if $\sum\nolimits_{k = 1}^\infty {{X_k}}$ converges a.s., then $\sum\nolimits_{k = 1}^\infty {\gamma_k}<\infty$?
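As an illustrative sketch (a simulation, not a proof), one can watch the partial sums stabilize for a summable choice of shape parameters; `numpy` is assumed, and $\gamma_k = 1/k^2$ is just an example with $\sum_k \gamma_k = \pi^2/6 < \infty$:

```python
import numpy as np

# Simulate partial sums S_N = X_1 + ... + X_N with X_k ~ Gamma(gamma_k, 1)
# for the summable choice gamma_k = 1/k^2. The late partial sums barely
# move, consistent with almost-sure convergence of the series.
rng = np.random.default_rng(0)
k = np.arange(1, 10_001)
shapes = 1.0 / k**2              # sum(gamma_k) = pi^2/6 < infinity
X = rng.gamma(shape=shapes)      # one draw per k, scale = 1
S = np.cumsum(X)
print(S[999], S[4999], S[9999])  # nearly identical values
```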

Best answer:

The other side can be proved as follows:

Assume that $\sum_n X_n$ converges a.s. and fix the truncation level $c=1$. By Kolmogorov's three-series theorem, all three of the following series converge:

(i) $\sum_n P(X_n>1)<\infty$,

(ii) $\sum_n Var(X_n 1_{[X_n\leq 1]})<\infty$,

(iii) $\sum_n E(X_n 1_{[X_n\leq 1]})<\infty$.

(Note that the truncation cannot simply be dropped: $X_n$ is an unbounded random variable, so $E(X_n 1_{[X_n\leq 1]})\neq E(X_n)$ in general.)

From (i), $P(X_n>1)\to 0$. Since the $\mathrm{Gamma}(\gamma,1)$ distribution is stochastically increasing in $\gamma$, having $\gamma_n\geq 1$ would give $P(X_n>1)\geq P(\mathrm{Gamma}(1,1)>1)=e^{-1}$, so $\gamma_n<1$ for all but finitely many $n$.

For those $n$,

$E(X_n 1_{[X_n\leq 1]})=\frac{1}{\Gamma(\gamma_n)}\int_0^1 x^{\gamma_n}e^{-x}\,dx\geq \frac{e^{-1}}{\Gamma(\gamma_n)}\int_0^1 x^{\gamma_n}\,dx=\frac{e^{-1}\gamma_n}{(\gamma_n+1)\Gamma(\gamma_n+1)}\geq \frac{\gamma_n}{2e},$

using $\gamma_n\Gamma(\gamma_n)=\Gamma(\gamma_n+1)\leq 1$ for $\gamma_n\in(0,1)$ and $\gamma_n+1\leq 2$. Together with (iii), this yields $\sum_n \gamma_n <\infty$.
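As a numerical aside (assuming `scipy`), the truncated mean appearing in the three-series argument has a closed form: for $X \sim \mathrm{Gamma}(g,1)$, $E(X 1_{[X\leq 1]}) = g \cdot P(g+1,1)$, where $P$ is the regularized lower incomplete gamma function. The ratio to $g$ stays bounded away from $0$ as $g \to 0$, which is why summability of the truncated means forces $\sum_n \gamma_n < \infty$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as Gamma
from scipy.special import gammainc  # regularized lower incomplete gamma P(a, x)

# For X ~ Gamma(g, 1): E[X 1_{X<=1}] = int_0^1 x^g e^(-x) dx / Gamma(g)
#                                    = g * P(g+1, 1).
# The ratio E[X 1_{X<=1}] / g tends to 1 - 1/e as g -> 0, so the truncated
# mean is comparable to g itself for small shape parameters.
for g in [0.5, 0.1, 0.01, 0.001]:
    closed_form = g * gammainc(g + 1.0, 1.0)
    numeric, _ = quad(lambda x: x**g * np.exp(-x) / Gamma(g), 0.0, 1.0)
    print(g, closed_form, numeric, closed_form / g)
```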