Conditional expectation of random sums


A few days ago I came across the following problem:

Let $\{X_n\}_{n\ge 0}$ and $W$ be random variables. Suppose $W : \Omega \to \mathbb{N} \cup \{\infty\}$ and $S_W := \sum_{i = 0}^W X_i \in L^1$. Determine whether or not the random sum $S_W$ satisfies \begin{equation} (1)\hskip2cmE(S_W| W) = \sum_{i = 0}^W E(X_i | W). \end{equation} I know this looks very similar to Wald's identity. However, since we can choose $W$ to be infinite outside of a set of probability $0$, I'm beginning to think that $(1)$ doesn't hold, but I haven't been able to find a counterexample. Is my intuition right, or does $(1)$ actually hold? Thanks in advance :)
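For intuition, here is a quick Monte Carlo sketch (a hypothetical toy setup, not part of the question) checking $(1)$ in a simple case where $W$ is bounded and the $X_i$ are independent of $W$, so that $\sum_{i=0}^w E(X_i\mid W=w) = \sum_{i=0}^w i = w(w+1)/2$:

```python
import numpy as np

# Hypothetical toy setup (not from the original post): W uniform on {0,...,4},
# X_i ~ N(i, 1) independent of W, so E(X_i | W) = E(X_i) = i and the
# right-hand side of (1) given W = w is w(w+1)/2.
rng = np.random.default_rng(0)
n = 200_000
W = rng.integers(0, 5, size=n)                  # W takes values in {0,...,4}
X = rng.normal(loc=np.arange(5), scale=1.0,     # column i holds samples of X_i
               size=(n, 5))

# S_W = sum_{i=0}^{W} X_i, computed via the indicator form sum_i X_i 1{W >= i}
mask = np.arange(5)[None, :] <= W[:, None]
S_W = (X * mask).sum(axis=1)

# Compare the empirical E(S_W | W = w) with w(w+1)/2 for each w
for w in range(5):
    print(w, round(S_W[W == w].mean(), 3), w * (w + 1) / 2)
```

In this bounded, independent case the conditional means match up to Monte Carlo error, consistent with $(1)$; the interesting question is what happens when $W$ can be infinite.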




To prove this, you need to verify that $\sum_{i=0}^W \mathbb{E}[X_i\mid W]$ satisfies the two defining properties of $\mathbb E [S_W|W]$:

  • It is $\sigma(W)$-measurable; it is straightforward to show that $\sum_{i=0}^W \mathbb{E}[X_i\mid W]$ is a measurable function of $W$.

  • For any bounded $\sigma(W)$-measurable random variable $Y$, $\mathbb E[\mathbb E[S_W|W] Y] = \mathbb E[S_W Y]$.

Rewriting the last condition, you need to show that $$\mathbb{E} \left[\sum_{i=0}^W \mathbb E[X_i|W] Y\right]=\mathbb{E} \left[\sum_{i=0}^W X_i Y\right]$$

This can sometimes be done as \begin{align*} \mathbb{E} \left[\sum_{i=0}^W \mathbb E[X_i|W] Y\right]&=\mathbb{E} \left[\sum_{i=0}^W \mathbb E[X_i Y|W]\right]\\ &=\mathbb{E} \left[\mathbb E\left[\sum_{i=0}^W X_i Y\middle|W\right]\right]\\ &=\mathbb{E} \left[\sum_{i=0}^W X_i Y\right]\\ \end{align*} The first equality uses the fact that $Y$ is $\sigma(W)$-measurable (so it can be pulled inside the conditional expectation), the second exchanges the sum with the conditional expectation, and the last line is just the tower property of conditional expectation.

The second equality (exchanging the sum with the conditional expectation) is not always valid in the infinite case, and the general answer is not fully known; a sufficient condition for it to hold is that $\sum_{i=0}^\infty \mathbb{E}\left[|X_i|\middle|W=\infty\right]$ converges. So if $\mathbb P(W=\infty)=0$, or if $\mathbb P (W=\infty)\neq 0$ but you can prove the above convergence, then you are done. Otherwise, proving the equality requires a bit more work.
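The defining property above can be checked numerically in a toy case (a hypothetical setup, not from the answer): take $W$ uniform on $\{0,\dots,4\}$ and $X_i \sim N(i,1)$ independent of $W$, so that $\mathbb E[S_W|W] = W(W+1)/2$, and take the bounded $\sigma(W)$-measurable variable $Y = \cos(W)$:

```python
import numpy as np

# Numerical check of the defining property E[E(S_W|W) Y] = E[S_W Y]
# in a hypothetical finite setup: W uniform on {0,...,4}, X_i ~ N(i, 1)
# independent of W, so E(S_W | W) = W(W+1)/2 exactly.
rng = np.random.default_rng(1)
n = 200_000
W = rng.integers(0, 5, size=n)
X = rng.normal(loc=np.arange(5), scale=1.0, size=(n, 5))
S_W = (X * (np.arange(5)[None, :] <= W[:, None])).sum(axis=1)

Y = np.cos(W)                         # a bounded measurable function of W
lhs = (W * (W + 1) / 2 * Y).mean()    # E[E(S_W|W) Y], using the exact conditional mean
rhs = (S_W * Y).mean()                # E[S_W Y], fully empirical
print(round(lhs, 3), round(rhs, 3))
```

Both sides agree up to Monte Carlo error, illustrating the characterizing identity in the finite case; the sufficient condition in the answer is about pushing this through when $W$ can be infinite.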


Here is my answer. First of all, notice that $S_W$ can be written as follows

\begin{equation} S_W=\sum\limits_{j=0}^\infty X_j\mathbb{1}_{\{W\geq j\}}=\lim\limits_{k\rightarrow\infty}\sum\limits_{j=0}^k X_j\mathbb{1}_{\{W\geq j\}}=\lim\limits_{k\rightarrow\infty}S_W^k. \end{equation}

By splitting each $X_j$ into its positive and negative parts and using the linearity of the conditional expectation, we may assume that $X_j\geq0$ for all $j$. In this case, we deduce that $0\leq S_W^k\leq S_W\in L^1(\Omega)$, so $S_W^k\in L^1(\Omega)$ and $E(S_W^k|W)$ is defined. Then we can use the monotone convergence theorem (for conditional expectations) to see that \begin{equation} \begin{split} E(S_W|W)&=\lim\limits_{k\rightarrow\infty}\sum\limits_{j=0}^k E(X_j\mathbb{1}_{\{W\geq j\}}|W)\\ &=\lim\limits_{k\rightarrow\infty}\sum\limits_{j=0}^k \mathbb{1}_{\{W\geq j\}}E(X_j|W)\\ &=\sum\limits_{j=0}^\infty \mathbb{1}_{\{W\geq j\}}E(X_j|W)\\ &=\sum\limits_{j=0}^W E(X_j|W). \end{split} \end{equation} Recall that we can take the indicator function out of the conditional expectation because it is a bounded $\sigma(W)$-measurable function.
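As a quick sanity check of the indicator decomposition $S_W=\sum_{j}X_j\mathbb{1}_{\{W\geq j\}}$ (toy data assumed, not from the answer), summing $X_0,\dots,X_W$ directly agrees path-by-path with the indicator form:

```python
import numpy as np

# Hypothetical toy data: verify that sum_j X_j 1{W >= j} equals the
# direct truncated sum X_0 + ... + X_W on every sample path.
rng = np.random.default_rng(2)
n, m = 10_000, 6
W = rng.integers(0, m, size=n)        # W bounded by m-1 here
X = rng.normal(size=(n, m))

via_indicators = (X * (np.arange(m)[None, :] <= W[:, None])).sum(axis=1)
via_direct = np.array([X[k, :W[k] + 1].sum() for k in range(n)])

print(np.allclose(via_indicators, via_direct))  # prints True
```

The decomposition is what lets the answer apply monotone convergence term by term.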