Mathematical justification for incorporating a conditional event in expectation?


Let $X_1,X_2,\dots$ be independent and identically distributed random variables. Furthermore, consider the sum $$ Y = X_1 + X_2 + \dots + X_N $$ where the number of terms $N$ is itself a random variable, independent of the $X_i$, all defined on the same probability space.

Given this preamble, the text I am reading claims the following

\begin{align} E[Y|N=n] &= E[X_1+X_2+\dots+X_N|N=n] \\ &=E[X_1+X_2+\dots+X_n|N=n] \\ &=E[X_1+X_2+\dots+X_n] \end{align}

While I understand the second and third equalities from an intuitive perspective (we know $N=n$, so this information can be incorporated into the number of terms in the sum), how can this be derived mathematically, or in a more pedantic/rigorous fashion using the laws of probability? Do we need to consider the joint distribution of the $X_i$ and $N$? Any ideas would be appreciated.


Answer 1 (accepted)

Let $I_k(n) = \begin{cases} 1 & : 1 \le k \le n\\ 0 & : \text{otherwise}\end{cases}$

Then $Y = \sum_{k=1}^\infty X_k\; I_k(N)$

$$\begin{align} \mathsf E(Y\mid N=n) & = \mathsf E\Big( \sum_{k=1}^\infty X_k \,I_k(N) \;\Big|\; N=n\Big) \\ & = \mathsf E\Big( \sum_{k=1}^n X_k \;\Big|\; N=n\Big) & \text{on the event } \{N=n\},\ I_k(N)=I_k(n) \\ & = \sum_{k=1}^n \mathsf E(X_k \mid N=n) & \text{linearity (finite sum)} \\ & = \sum_{k=1}^n \mathsf E(X_k) & \text{each } X_k \text{ is independent of } N \\ & = \mathsf E\Big(\sum_{k=1}^n X_k\Big) \end{align}$$

Note that once $I_k(N)$ is replaced by the constant $I_k(n)$ on the event $\{N=n\}$, the sum is finite, so ordinary linearity of expectation applies without any convergence argument.
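As a quick numerical sanity check of the identity $\mathsf E(Y\mid N=n)=\mathsf E(X_1+\dots+X_n)=n\,\mathsf E(X_1)$, here is a small simulation. The distributional choices are illustrative assumptions, not from the question: $X_i \sim \text{Uniform}(0,1)$ (so $\mathsf E[X_i]=0.5$) and $N$ uniform on $\{1,\dots,8\}$, independent of the $X_i$.

```python
import random

random.seed(0)

mu = 0.5           # E[X_i] for X_i ~ Uniform(0, 1)  (assumed distribution)
n = 4              # the value of N we condition on
trials = 200_000

# Simulate (N, Y) pairs and average Y only over trials where N == n.
total, count = 0.0, 0
for _ in range(trials):
    N = random.randint(1, 8)   # N independent of the X_i
    if N != n:
        continue
    Y = sum(random.random() for _ in range(N))   # Y = X_1 + ... + X_N
    total += Y
    count += 1

# The conditional average should be close to n * mu = 2.0.
print(total / count)
```

The empirical conditional mean lands near $n\mu = 2.0$, matching the derivation above.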

Answer 2

Let $S(N)=\sum_{i=1}^{\infty} X_i\,\mathbf{1}_{\{i\leq N\}}$. Then $E[S(N)\mid N]=N\mu$, where $\mu=E[X_i]$.

Therefore, $E[S(N)\mid N]$ is simply a function of the random variable $N$. Once we set $N=n$, we are evaluating that function at a point, just as $f(x)\big|_{x=2}$ has a definite value: $E[S(N)\mid N=n]=n\mu$.
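A consequence of $E[S(N)\mid N]=N\mu$, via the tower property, is Wald's identity $E[Y]=E[N]\,\mu$. The sketch below checks this numerically under the same illustrative assumptions as before ($X_i \sim \text{Uniform}(0,1)$, $N$ uniform on $\{1,\dots,8\}$, independent), which are not part of the original answer.

```python
import random

random.seed(1)

mu = 0.5                 # E[X_i] for X_i ~ Uniform(0, 1)  (assumed distribution)
trials = 200_000

# Accumulate Y and N over all trials; Wald's identity says E[Y] = E[N] * mu.
total_Y = 0.0
total_N = 0
for _ in range(trials):
    N = random.randint(1, 8)                          # E[N] = 4.5
    total_Y += sum(random.random() for _ in range(N)) # Y = X_1 + ... + X_N
    total_N += N

# Empirical E[Y] vs. empirical E[N] * mu; both should be near 2.25.
print(total_Y / trials, (total_N / trials) * mu)
```

The two printed numbers agree closely, illustrating that conditioning on $N$ and then averaging over $N$ recovers the unconditional mean.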