Convergence of the sum of moments implies convergence of the sum of arbitrary random variables


I'm trying to solve the following problem.

Let $\{X_j\}$ be a sequence of arbitrary random variables such that $$\sum_{j=1}^{\infty}\mathbb{E}[|X_j|^{r_j}]<\infty$$ for some sequence $\{r_j\}$ in $(0,1]$. Then $$\sum_{j=1}^{\infty}X_j$$ converges absolutely a.s.

I tried using Markov's inequality. Writing $S_n=\sum_{j=1}^n X_j$, the Cauchy criterion gives $$\mathbb{P}(S_n \text{ converges})=\lim_m \lim_n \lim_r\mathbb{P}\left(\max_{n\le k \le r}|S_k-S_n|\le \frac{1}{m}\right)\le \lim_m \lim_n \lim_r\mathbb{P}\left(|S_r-S_n|\le\frac{1}{m}\right)$$

So, by Markov's inequality, $$\mathbb{P}\left(|S_r-S_n|\ge\frac{1}{m}\right)=\mathbb{P}\left(\left|\sum_{j=n+1}^r X_j\right|\ge\frac{1}{m}\right)\le m^{h_{n}^r}\,\mathbb{E}\left[\left|\sum_{j=n+1}^rX_j\right|^{h_{n}^r}\right]$$

where $h_n^r=\min\{r_j: n+1\le j\le r\}$.

From here, I'm stuck.
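As a sanity check on the Markov bound in the attempt above, here is a quick Monte Carlo illustration (not part of the proof). The setup is hypothetical: I take $X_j = E_j/j^2$ with $E_j$ i.i.d. $\mathrm{Exp}(1)$, exponent $h = 1/2$, and threshold $1/m$ with $m = 2$, and check numerically that $\mathbb{P}(|S| \ge 1/m) \le m^h\,\mathbb{E}[|S|^h]$.

```python
# Monte Carlo sanity check (illustrative only) of Markov's inequality:
# P(|S| >= 1/m) <= m**h * E[|S|**h] for any h > 0.
# Hypothetical example: X_j = E_j / j**2 with E_j i.i.d. Exp(1), h = 1/2, m = 2.
import numpy as np

rng = np.random.default_rng(1)
m, h = 2.0, 0.5
n_samples, n_terms = 100_000, 50
j = np.arange(1, n_terms + 1)

# S = sum_{j=1}^{50} X_j, simulated 100,000 times.
S = (rng.exponential(size=(n_samples, n_terms)) / j**2).sum(axis=1)

lhs = np.mean(np.abs(S) >= 1.0 / m)   # estimate of P(|S| >= 1/m)
rhs = m**h * np.mean(np.abs(S) ** h)  # estimate of m^h * E[|S|^h]
print(lhs, rhs)
```

The bound holds for any $h>0$; the difficulty in the attempt is rather how to control $\mathbb{E}\big[|\sum_{j=n+1}^r X_j|^{h_n^r}\big]$ by the given moments, which is where I'm stuck.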

I'll appreciate any help.

Best answer:

The assumption $\sum_{j=1}^{\infty}\mathbb{E}[|X_j|^{r_j}]<\infty$ implies that $\sum_{j=1}^{\infty} |X_j|^{r_j}$ is almost surely finite, which implies in turn that $|X_j|^{r_j}\to 0$ almost surely. In particular, almost surely $|X_j|^{r_j}<1$, hence $|X_j|<1$, for all large $j$; and since $0<r_j\leqslant 1$ gives $1/r_j\geqslant 1$, we get $|X_j| = \left(|X_j|^{r_j}\right)^{1/r_j}\leqslant |X_j|^{r_j}\to 0$, so $|X_j|\to 0$ almost surely.

Now fix an $\omega$ such that $\sum_{j=1}^{\infty} |X_j(\omega)|^{r_j}$ is finite, and choose $J=J(\omega)$ with $|X_j(\omega)|\leqslant 1$ for all $j\geqslant J$. For such $j$, $$|X_j(\omega)| = |X_j(\omega)|^{r_j}\,|X_j(\omega)|^{1-r_j} \leqslant |X_j(\omega)|^{r_j},$$ since $|X_j(\omega)|^{1-r_j}\leqslant 1$. By comparison, $$\sum_{j=1}^{\infty} |X_j(\omega)| \leqslant \sum_{j<J}|X_j(\omega)| + \sum_{j\geqslant J}|X_j(\omega)|^{r_j} < \infty,$$ so $\sum_{j=1}^{\infty} X_j$ converges absolutely almost surely.
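The conclusion can also be seen numerically. The following sketch uses a hypothetical example: $r_j = 1/2$ and $X_j = E_j/j^4$ with $E_j$ i.i.d. $\mathrm{Exp}(1)$, so that $\mathbb{E}[|X_j|^{1/2}] = \Gamma(3/2)/j^2$ is summable, and the partial sums of $\sum_j |X_j|$ should stabilize on every simulated path.

```python
# Numerical illustration (not a proof): r_j = 1/2 and X_j = E_j / j**4 with
# E_j i.i.d. Exp(1).  Then E[|X_j|^{1/2}] = Gamma(3/2) / j**2, which is
# summable, so sum_j |X_j| should converge almost surely.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_terms = 200, 10_000
j = np.arange(1, n_terms + 1)

# The moment condition: sum_j E[|X_j|^{1/2}] = Gamma(3/2) * sum_j 1/j^2 < inf.
moment_sum = (np.sqrt(np.pi) / 2) * np.sum(1.0 / j**2)  # Gamma(3/2) = sqrt(pi)/2

# Simulate 200 independent paths of the partial sums of |X_j|.
X = rng.exponential(size=(n_paths, n_terms)) / j**4
partial = np.cumsum(np.abs(X), axis=1)

# The contribution of terms 1001..10000 is tiny on every path,
# consistent with absolute a.s. convergence of the series.
tail = partial[:, -1] - partial[:, 999]
print(moment_sum, tail.max())
```

Of course no finite simulation proves almost sure convergence; it only illustrates the comparison argument above, where the summable moments $\mathbb{E}[|X_j|^{r_j}]$ eventually dominate $|X_j|$.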