Uniform integrability for a specific martingale

Suppose I have an i.i.d. sequence of random variables $X_1, X_2, \ldots$ such that $\mathbb{P}(X_i =+1)=\mathbb{P}(X_i = -1)=0.5$. I need to prove that the random series $\sum_{k \geq 1} \frac{X_k}{k}$ converges a.s. and in $L^1$.

I should use the theorem stating that if $(M_n)_n$ is a martingale w.r.t. a filtration $(F_n)_n$, and $F_{\infty}=\sigma (F_n:n \in \mathbb{N})$, then $(M_n)_n$ is uniformly integrable iff there exists an $F_{\infty}$-measurable random variable $M$ such that $M_n \rightarrow M$ a.s. and in $L^1$.

So I should just check uniform integrability. From the text of the exercise, I think that the "obvious" martingale would be $M_n=\sum_{k=1}^{n} \frac{X_k}{k}$.

Now I get stuck, because I tried to check uniform integrability by checking that $\sup_n E[|M_n|] < +\infty$, but if I bound those sums I have:

$E[|M_n|] \leq \sum_{k=1}^{n} \frac{1}{k}E[|X_k|]= \sum_{k=1}^{n} \frac{1}{k}$, and this harmonic bound diverges as $n \to +\infty$.

What am I missing in uniform integrability?
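As a quick numerical sanity check (an illustration with parameters of my own choosing, not part of any proof), one can estimate $E[|M_n|]$ by simulation; it stays far below the divergent harmonic bound, suggesting the triangle-inequality estimate is simply too loose:

```python
import numpy as np

# Illustrative simulation: estimate E|M_n| for M_n = sum_{k=1}^n X_k / k
# and compare it with the harmonic-sum bound sum_{k=1}^n 1/k used above.
rng = np.random.default_rng(0)
n, trials = 1000, 5000

signs = rng.choice([-1.0, 1.0], size=(trials, n))   # i.i.d. +/-1 with prob 1/2
M_n = (signs / np.arange(1, n + 1)).sum(axis=1)     # one M_n per simulated path

est = np.abs(M_n).mean()                      # Monte Carlo estimate of E|M_n|
harmonic = (1.0 / np.arange(1, n + 1)).sum()  # the divergent bound H_n

print(f"estimated E|M_n|  ~ {est:.3f}")
print(f"harmonic bound H_n = {harmonic:.3f}")
```

With these (arbitrary) parameters the estimate hovers around $1$, while $H_{1000} \approx 7.49$.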

There are 2 answers below.


First of all, note that $L^1$-boundedness does not imply uniform integrability, i.e. $\sup_{n \geq 1} \mathbb{E}|M_n|<\infty$ does not imply that $(M_n)_{n \in \mathbb{N}}$ is uniformly integrable. (A classical counterexample on $([0,1],\mathcal{B}([0,1]),\text{Leb})$ is $M_n = n \, \mathbf{1}_{(0,1/n)}$, a martingale with $\mathbb{E}|M_n|=1$ for all $n$ that is not uniformly integrable.)

However, $L^2$-boundedness does imply uniform integrability: since $|M_n| \leq M_n^2/K$ on $\{|M_n|>K\}$, we get $\mathbb{E}\big[|M_n| \mathbf{1}_{\{|M_n|>K\}}\big] \leq \frac{1}{K}\mathbb{E}(M_n^2)$, which tends to $0$ uniformly in $n$ as $K \to \infty$ whenever $\sup_n \mathbb{E}(M_n^2) < \infty$. And in fact $(M_n)_{n \in \mathbb{N}}$ is $L^2$-bounded. Since the random variables $X_n$ are independent with mean zero and variance $1$, we have

$$\mathbb{E}(M_n^2) = \sum_{j=1}^n \sum_{k=1}^n \frac{1}{kj} \underbrace{\mathbb{E}(X_j X_k)}_{=0 \, \, \text{for $j \neq k$}} = \sum_{k=1}^n \frac{1}{k^2} \underbrace{\mathbb{E}(X_k^2)}_{=1},$$

and so

$$\sup_{n \in \mathbb{N}} \mathbb{E}(M_n^2) \leq \sum_{k \geq 1} \frac{1}{k^2} < \infty.$$
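As a side illustration (a simulation with parameters of my own choosing, not part of the argument), the empirical second moment of $M_n$ tracks the partial sum $\sum_{k \leq n} 1/k^2$, which is capped by $\pi^2/6$:

```python
import numpy as np

# Illustrative check: empirical E[M_n^2] vs. the exact partial sum of 1/k^2.
rng = np.random.default_rng(1)
n, trials = 500, 20000

signs = rng.choice([-1.0, 1.0], size=(trials, n))  # i.i.d. +/-1 with prob 1/2
M_n = (signs / np.arange(1, n + 1)).sum(axis=1)

empirical = (M_n ** 2).mean()                     # Monte Carlo estimate of E[M_n^2]
partial = (1.0 / np.arange(1, n + 1) ** 2).sum()  # sum_{k<=n} 1/k^2 (exact)

print(f"empirical E[M_n^2] ~ {empirical:.4f}")
print(f"sum of 1/k^2, k<=n = {partial:.4f}")
print(f"pi^2 / 6           = {np.pi ** 2 / 6:.4f}")
```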


Although it can be done with martingales, it's much easier with Kolmogorov's two-series theorem. (In probability courses, this theorem is usually taught before martingales anyway, and proving it requires less machinery, for instance no conditional expectation.) Essentially, it says that for a sequence of *independent* random variables $A_k$, sufficient conditions for almost sure convergence of $\sum_{k=1}^\infty A_k$ are:

  1. The sum of the expectations $\sum_{k=1}^\infty \Bbb{E}[A_k]$ converges, and
  2. The sum of the variances $\sum_{k=1}^\infty \operatorname{Var}(A_k)$ converges.

You're trying to prove that $\sum_{k=1}^\infty X_k/k$ converges a.s., where the $X_i$ are i.i.d. uniform on $\{-1, +1\}$. Since $\Bbb{E}[X_k/k] = 0$ and $\operatorname{Var}(X_k/k) = 1/k^2$, it follows that the sums $\sum_{k=1}^\infty \Bbb{E}[X_k/k]$ and $\sum_{k=1}^\infty \operatorname{Var}(X_k/k)$ are both convergent, so $\sum_{k=1}^\infty X_k/k$ converges a.s.
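As an illustration (a sketch with parameters of my own choosing, not part of the proof), simulated partial-sum paths visibly settle down, consistent with almost sure convergence; note that different paths converge to different random limits:

```python
import numpy as np

# Illustrative simulation: the partial sums S_n = sum_{k<=n} X_k / k of a few
# independent paths barely move over the second half of the run.
rng = np.random.default_rng(2)
n_terms, paths = 100_000, 3

for p in range(paths):
    signs = rng.choice([-1.0, 1.0], size=n_terms)       # i.i.d. +/-1 signs
    partial = np.cumsum(signs / np.arange(1, n_terms + 1))
    # Oscillation of the path over its last half; small if the series settles.
    tail = partial[n_terms // 2:]
    print(f"path {p}: S_n ~ {partial[-1]:+.4f}, "
          f"late oscillation ~ {tail.max() - tail.min():.5f}")
```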