Rigorous argument for splitting a sum according to a random time


Let $X_1, X_2, \dots$ be random variables and let $\tau_1, \tau_2, \tau_3$ be integer-valued random variables such that $\tau_1 < \tau_2 < \tau_3$ almost surely. None of these random variables are assumed to be independent!

Now, consider the statement $$ E \left[ \sum_{k=\tau_1}^{\tau_3} X_k \right] = E \left[ \sum_{k=\tau_1}^{\tau_2} X_k + \sum_{k=\tau_2 +1}^{\tau_3} X_k \right] = E \left[ \sum_{k=\tau_1}^{\tau_2} X_k \right] +E \left[\sum_{k=\tau_2 +1}^{\tau_3} X_k \right].$$

The first identity clearly holds, since the sum splits pathwise at $\tau_2$. However, does the second identity hold just as obviously? If the statement is correct, how can one rigorously show that it is?

If the statement is incorrect, why and what would one need to assume about the random variables for this to be true?

Best answer:

Let $\mathcal{F}$ be the sigma-field generated by $\sum_{k=\tau_1}^{\tau_2} X_k$. By the tower rule,
$$E\left[\sum_{k=\tau_1}^{\tau_2} X_k + \sum_{k=\tau_2+1}^{\tau_3} X_k\right] = E\left[E\left(\sum_{k=\tau_1}^{\tau_2} X_k + \sum_{k=\tau_2+1}^{\tau_3} X_k \,\middle|\, \mathcal{F}\right)\right] = E\left[\sum_{k=\tau_1}^{\tau_2} X_k\right] + E\left[E\left(\sum_{k=\tau_2+1}^{\tau_3} X_k \,\middle|\, \mathcal{F}\right)\right] = E\left[\sum_{k=\tau_1}^{\tau_2} X_k\right] + E\left[\sum_{k=\tau_2+1}^{\tau_3} X_k\right].$$
In the second step it is used that $\sum_{k=\tau_1}^{\tau_2} X_k$ is $\mathcal{F}$-measurable, together with linearity of conditional expectation.

Consequently, the equality does hold, provided the expectations involved are well defined (for instance, each sum is integrable, or all the terms are nonnegative).

Edit (Thanks for your remark):

One can use the linearity of expectation directly.

That is, $E(U+V) = E(U) + E(V)$ with $U := \sum_{k=\tau_1}^{\tau_2} X_k$ and $V := \sum_{k=\tau_2+1}^{\tau_3} X_k$. Linearity of expectation is also implicitly used in my original answer.
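As a quick numerical illustration (not part of the original argument, and with made-up distributions for the $X_k$ and the $\tau_i$), one can check the identity by Monte Carlo: the sample averages of the whole sum and of the two pieces should agree, even though the $X_k$ and the random times are deliberately dependent.

```python
# Monte Carlo sanity check of
#   E[sum_{k=tau1}^{tau3} X_k] = E[sum_{k=tau1}^{tau2} X_k] + E[sum_{k=tau2+1}^{tau3} X_k]
# The sequence X_k and the times tau1 < tau2 < tau3 below are illustrative
# choices (not from the post); the X_k share a common factor z, and the
# tau_i depend on the X_k, so nothing is independent.
import random

random.seed(0)

N_TRIALS = 100_000
total = part1 = part2 = 0.0

for _ in range(N_TRIALS):
    # Dependent sequence: every X_k shares the common Gaussian factor z.
    z = random.gauss(0, 1)
    X = [z + random.gauss(0, 1) for _ in range(10)]

    # Integer times depending on the X_k, with tau1 < tau2 < tau3 a.s.
    tau1 = 0 if X[0] > 0 else 1
    tau2 = tau1 + 1 + (1 if X[2] > 0 else 0)
    tau3 = tau2 + 2

    total += sum(X[tau1:tau3 + 1])           # sum over k = tau1 .. tau3
    part1 += sum(X[tau1:tau2 + 1])           # sum over k = tau1 .. tau2
    part2 += sum(X[tau2 + 1:tau3 + 1])       # sum over k = tau2+1 .. tau3

lhs = total / N_TRIALS
rhs = part1 / N_TRIALS + part2 / N_TRIALS
print(abs(lhs - rhs))  # tiny: only floating-point rounding separates the two
```

Of course, the agreement here is exact up to rounding because the sum splits pathwise in every trial; the content of the answer above is that taking expectations preserves this split even under dependence.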