Expectation of sum of N random variables equality


Let $N$ be Poisson distributed with parameter $\lambda$, and let $S=\sum_{r=1}^N X_r$, where $\{X_r:r\geq 0 \}$ are i.i.d. non-negative integer-valued random variables independent of $N$. Show that, for any function $g$ such that the expectation of $g(S)$ exists: $$ \mathbb{E}(Sg(S))=\lambda\mathbb{E}(g(S+X_0)X_0) $$

I'm trying to teach myself probability using Grimmett & Stirzaker, and this exercise (3.8.6) really confuses me. I have the solution, which conditions on $N$ and $X_N$: $$ \text{LHS}=\mathbb{E}(\mathbb{E}(Sg(S)|N))= \mathbb{E}\{N\mathbb{E}(X_Ng(S)|N)\}\\ =\sum_{n\geq 1} \frac{e^{-\lambda}\lambda^n}{(n-1)!} \int x\,\mathbb{E}\Big(g\Big(\sum_{r=1}^{n-1}X_r+x\Big)\Big)\,dF(x)\\ =\lambda \int x\,\mathbb{E}(g(S+x))\,dF(x)=\text{RHS} $$

First I got $$ \mathbb{E}(X_N g(S)|N)\\ =\mathbb{E}(\mathbb{E}(X_Ng(S)|X_N)|N) \quad \text{(by tower property)}\\ =\mathbb{E}\Big(X_N\mathbb{E}\Big(g\Big(\sum_{r=1}^{N-1}X_r+X_N\Big)\Big|X_N\Big)\Big|N\Big) $$ Then I plug this back into the integral, but how does the conditional expectation given $X_N$ just become the regular expectation $\mathbb{E}(g(S+x))$? I might have misunderstood this completely, because I thought you cannot do that with $g(S)$. If it were $\mathbb{E}(\sum_{r=1}^{N-1}X_r+X_N|X_N)$, then I get that it equals $X_N+\mathbb{E}(\sum_{r=1}^{N-1}X_r)$. I'm also not sure if I can use the tower property like that: is conditioning on $N$ less information than conditioning on $X_N$?

Lastly, the second-to-last equality lost me: where did the factors in front of the integral go?

**Answer:**

First, note that, when writing mathematics, you should always define the notation you use. $F$ is not defined in your text; evidently it is the common distribution function of the $X_r$, but it should be stated.

For your problem:

On the second line you have

$$ \sum_{n\geq 0}\frac{e^{-\lambda}\lambda^{n} }{n!} n \int_{\mathbb{R}}\mathbf{E}\left[x g\left(x+\sum_{i=1}^{n-1}X_i\right)\right] dF(x)$$ which is equal to $$ \lambda \sum_{n\geq 1}\frac{e^{-\lambda}\lambda^{n-1}}{(n-1)!} \int_{\mathbb{R}}\mathbf{E}\left[x g\left(x+\sum_{i=1}^{n-1}X_i\right)\right] dF(x).$$ Perform a change of indices $k=n-1$ to get $$ \lambda \sum_{k\geq 0}\frac{e^{-\lambda}\lambda^{k}}{k!} \int_{\mathbb{R}}\mathbf{E}\left[x g\left(x+\sum_{i=1}^{k}X_i\right)\right] dF(x).$$ As $F$ is also the law of $X_0$, and $X_0$ is independent of $X_1,\dots,X_k$ (this is why conditioning on $X_N=x$ reduces to an ordinary expectation), we have $$ \lambda \sum_{k\geq 0}\frac{e^{-\lambda}\lambda^{k}}{k!} \int_{\mathbb{R}}\mathbf{E}\left[x g\left(x+\sum_{i=1}^{k}X_i\right)\right] dF(x)=\lambda \sum_{k\geq 0}\frac{e^{-\lambda}\lambda^{k}}{k!} \mathbf{E}\left[X_0 g\left(X_0+\sum_{i=1}^{k}X_i\right)\right],$$ but $$ \sum_{k\geq 0}\frac{e^{-\lambda}\lambda^{k}}{k!} \mathbf{E}\left[X_0 g\left(X_0+\sum_{i=1}^{k}X_i\right)\right]= \mathbf{E}\left[X_0 g\left(X_0+\sum_{i=1}^{N}X_i\right)\right], $$ by conditioning on $N$ (here we use that $X_0$ is independent of $N$). Multiplying by the factor $\lambda$ gives $\lambda\,\mathbf{E}\left[X_0\,g(S+X_0)\right]$, which is the RHS.
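As a sanity check, the identity can be verified numerically. The sketch below is a Monte Carlo estimate of both sides; the choices $\lambda=2$, $X_r$ uniform on $\{0,1,2\}$, and $g(s)=e^{-s}$ are illustrative assumptions, not part of the problem.

```python
import math
import random

rng = random.Random(12345)
LAM = 2.0  # Poisson parameter lambda (illustrative choice)

def draw_x():
    # i.i.d. non-negative integer-valued X_r: uniform on {0, 1, 2} (illustrative)
    return rng.randrange(3)

def draw_n():
    # Poisson(LAM) sample via Knuth's multiplication method
    limit = math.exp(-LAM)
    k, prod = 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def g(s):
    # any g with E|g(S)| < infinity; e^{-s} is bounded, so it qualifies
    return math.exp(-s)

def estimate(trials=200_000):
    lhs = rhs = 0.0
    for _ in range(trials):
        # LHS: E[S g(S)]
        s = sum(draw_x() for _ in range(draw_n()))
        lhs += s * g(s)
        # RHS: lambda * E[X_0 g(S + X_0)], with X_0 a fresh draw independent of S
        x0 = draw_x()
        s2 = sum(draw_x() for _ in range(draw_n()))
        rhs += LAM * x0 * g(s2 + x0)
    return lhs / trials, rhs / trials

lhs, rhs = estimate()
print(f"E[S g(S)]              ~ {lhs:.4f}")
print(f"lambda E[X0 g(S+X0)]   ~ {rhs:.4f}")
```

The two estimates should agree up to Monte Carlo error, which is the content of the identity.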