Expectation of almost surely finite sum


Let $T$ be an almost surely finite random variable taking values in $\Bbb N$. Let $X_k$ be a sequence of random variables.

What can we say about $\displaystyle\Bbb E\left( \sum_{k=1}^T X_k\right)$ ?

I thought that $$\Bbb E\left( \sum_{k=1}^T X_k\right) = \sum_{k=1}^T \Bbb E(X_k)$$ but this is apparently wrong unless $T$ is almost surely constant (in which case it is just linearity of expectation). Indeed, when $T$ is not constant the right-hand side is itself a random variable, so the identity does not even make sense as stated.
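To see concretely that no product formula $\Bbb E(X_1)\,\Bbb E(T)$ can hold in general, here is a minimal simulation sketch (the distributions are my own choice, not from the question): $X_k$ are iid fair $\pm 1$ coin flips, and $T$ "peeks ahead" at $X_2$, so $T$ is neither independent of the $X_k$ nor a stopping time.

```python
import random

random.seed(2)

# Hypothetical counterexample: X_k iid uniform on {-1, +1},
# and T peeks at X_2: T = 1 if X_2 = +1, else T = 2.
# T is NOT independent of the X_k and NOT a stopping time.
N = 200_000
total = 0.0
for _ in range(N):
    x1 = random.choice([-1, 1])
    x2 = random.choice([-1, 1])
    T = 1 if x2 == 1 else 2
    total += x1 + (x2 if T == 2 else 0)

# E(X_k) = 0 and E(T) = 3/2, so any product formula would predict 0,
# but E(sum) = E(X_1) + E(X_2 * 1{X_2 = -1}) = 0 + (-1)(1/2) = -1/2.
est = total / N
print(est)  # close to -0.5
```

The estimate concentrates near $-1/2$, not $0$, because $T$ is allowed to look at the future of the sequence.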

Best answer:

If $T$ is almost surely constant, then $E(\sum_{k=1}^{T}X_{k})=\sum_{k=1}^{T}E(X_{k})$.

However, there are cases where even if $T$ is not constant, you have nice identities.

First consider the case where $T$ is a positive integer-valued random variable that is independent of the $X_{i}$'s. Also suppose that $E(X_{k})=\mu$ for all $k$.

Then, conditioning on $T$ and using the independence (so that $E(X_{k}\mid T)=E(X_{k})=\mu$), $$E\left(\sum_{k=1}^{T}X_{k}\right)=E\left(E\left(\sum_{k=1}^{T}X_{k}\,\Big|\,T\right)\right)=E\left(\sum_{k=1}^{T}E(X_{k}\mid T)\right)=E(T\cdot\mu)=\mu\, E(T).$$
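The computation above can be checked numerically. The following sketch uses distributions of my own choosing: $T$ uniform on $\{1,\dots,5\}$ (so $E(T)=3$), drawn independently of iid Bernoulli$(1/2)$ summands (so $\mu = 1/2$), and the identity predicts $E(\sum_{k=1}^{T}X_{k}) = \mu\,E(T) = 1.5$.

```python
import random

random.seed(0)

# Assumed setup for illustration: T ~ Uniform{1,...,5}, independent of the X_k;
# X_k iid Bernoulli(0.5), so mu = 0.5 and E(T) = 3.
N = 200_000
mu = 0.5

total = 0.0
t_total = 0.0
for _ in range(N):
    T = random.randint(1, 5)                           # drawn first, independent of X
    s = sum(1 for _ in range(T) if random.random() < mu)  # sum of T Bernoulli draws
    total += s
    t_total += T

est = total / N              # Monte Carlo estimate of E(sum_{k=1}^T X_k)
expected = mu * (t_total / N)  # mu * (empirical mean of T), should match est
print(est, expected)
```

Both quantities land near $1.5$, as the conditioning argument predicts.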

Now consider another case: $T$ is a stopping time with respect to the filtration $(\mathcal{F}_{n})_{n}=(\sigma(X_{1},\dots,X_{n}))_{n}$.

In lighter, more digestible terms: a random variable $T$ is a stopping time with respect to the collection $(X_{n})_{n}$ if the event $\{T\leq k\}$ is determined by $X_{1},\dots,X_{k}$ alone, i.e. the decision to stop by time $k$ cannot peek at the future.

In this case we have what is called Wald's identity: if the $X_{n}$ are iid with $E|X_{1}|<\infty$, and $T$ is a stopping time with respect to $(X_{k})_{k}$ such that $E(T)<\infty$, then $E(\sum_{k=1}^{T}X_{k})=E(X_{1})E(T)$.