Find $Var[X(t)]$, where $X(t)=\sum_{i=1}^{N(t)}Y_i$, $N(t)$ is a Poisson process, and the $Y_i$ are i.i.d. random variables


The answer is $E[N(t)]\,Var[Y_1]+E^2[Y_1]\,Var[N(t)]$.

but I get this: $$ Var[X(t)]=E[X^2(t)]-(E[X(t)])^2=E[N^2(t)]E[Y_1^2]-E^2[N(t)]E^2[Y_1]$$ where: $$ E[X(t)]=E[E[\sum_{i=1}^{N(t)}Y_i\mid N(t)=n]]=E[N(t)]E[Y_1]\\ E[X^2(t)]=E[E[(\sum_{i=1}^{N(t)}Y_i)^2\mid N(t)=n]]=\sum_{n=0}^{\infty}n^2E[Y_1^2]P(N(t)=n)=E[N^2(t)]E[Y_1^2] $$
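To compare the two expressions numerically, here is a rough simulation sketch with arbitrary choices $E[N(t)]=2$ and $Y_i\sim\mathrm{Exp}(1)$ (these parameters are just for the check, not part of the problem):

```python
# Rough numerical check with arbitrary choices: simulate
# X(t) = sum_{i=1}^{N(t)} Y_i with N(t) ~ Poisson(2) and Y_i ~ Exp(1),
# then compare the empirical variance against both candidate formulas.
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0                                  # E[N(t)] = Var[N(t)] for a Poisson count
n_samples = 200_000

N = rng.poisson(mu, size=n_samples)
X = np.array([rng.exponential(1.0, size=n).sum() for n in N])

EY, EY2, VarY = 1.0, 2.0, 1.0             # moments of Exp(1)
EN, EN2, VarN = mu, mu + mu**2, mu        # moments of Poisson(mu)

print("empirical Var[X(t)]          :", X.var())
print("E[N]Var[Y] + E[Y]^2 Var[N]   :", EN * VarY + EY**2 * VarN)
print("E[N^2]E[Y^2] - E^2[N]E^2[Y]  :", EN2 * EY2 - EN**2 * EY**2)
```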

Am I wrong? How do I get the correct result?


2 Answers

Accepted answer

I personally find it insightful to use the moment generating function for this type of problem, since it allows one to calculate moments of arbitrary order. With the convention $M_X(\lambda)=\mathbb{E}[e^{\lambda X}]$, one can show that, since the variables are i.i.d. and independent of $N(t)$,

$$M_{X(t)}(\lambda)=\mathbb{E}_{N(t)}\,\mathbb{E}\left[\exp\left(\lambda\sum_{i=1}^{N(t)}Y_i\right)\,\Bigg|\,N(t)\right]=\mathbb{E}_{N(t)}[(M_{Y_1}(\lambda))^{N(t)}]$$

Extract moments of $X$ by taking derivatives with respect to $\lambda$ and evaluating at $\lambda=0$. For example

$$\mathbb{E}[X(t)]=M'_{Y_1}(\lambda)\mathbb{E}_N[N (M_{Y_1}(\lambda))^{N-1}]\Big|_{\lambda=0}=E(Y)E(N(t))$$ and for the second moment $$\mathbb{E}[X^2(t)]=M''_{Y_1}(\lambda)\mathbb{E}_N[N (M_{Y_1}(\lambda))^{N-1}]+(M'_{Y_1}(\lambda))^2\mathbb{E}_N[N(N-1) (M_{Y_1}(\lambda))^{N-2}]\Big|_{\lambda=0}\\=E(Y^2)E(N(t))+(E(Y))^2(E(N^2(t))-E(N(t)))$$

Now the variance is easy to calculate and it yields

$$Var(X(t))=E(N)Var(Y)+E(Y)^2 Var(N)$$
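As a concrete sanity check: since $N(t)$ is a Poisson count with some mean $\mu=\lambda t$, we have $\mathbb{E}_{N(t)}[z^{N(t)}]=e^{\mu(z-1)}$, so the generating function above reduces to $\exp\big(\mu(M_{Y_1}(\lambda)-1)\big)$. Taking, say, $Y_1\sim\mathrm{Exp}(a)$ (an arbitrary choice for illustration), a symbolic differentiation reproduces the formula:

```python
# Symbolic sanity check with the arbitrary choice Y_1 ~ Exp(a):
# M_{X(t)}(s) = exp(mu * (M_{Y_1}(s) - 1)) for N(t) ~ Poisson(mu).
import sympy as sp

s, mu, a = sp.symbols('s mu a', positive=True)

M_Y = a / (a - s)                    # MGF of Exp(a), valid for s < a
M_X = sp.exp(mu * (M_Y - 1))         # MGF of the compound Poisson sum

EX  = sp.diff(M_X, s).subs(s, 0)     # E[X(t)]   = mu/a
EX2 = sp.diff(M_X, s, 2).subs(s, 0)  # E[X^2(t)]
var_X = sp.simplify(EX2 - EX**2)

# The formula predicts E[N]Var(Y) + E[Y]^2 Var(N) = mu/a^2 + mu/a^2 = 2*mu/a^2
print(sp.simplify(var_X - 2*mu/a**2))   # -> 0
```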

Note: You can obtain the exact same result by carrying your calculation through but taking into account that

$$\mathbb{E}[(\sum_{i=1}^n Y_i)^2|N(t)=n]=nE(Y_1^2)+(n^2-n)E(Y_1)^2$$

since there are $n$ diagonal terms of the form $Y_i^2$ that contribute $\mathbb{E}[Y_1^2]$ and $n(n-1)$ off-diagonal ones of the form $Y_iY_j$ that contribute $\mathbb{E}[Y_1]^2$ (you need the i.i.d. assumption to argue both). Then it is (almost) trivial to see that $n^k\to \mathbb{E}[N^k]$ when you take the expectation over $N$.
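Written out, that last step would read

$$\mathbb{E}[X^2(t)]=\mathbb{E}[N(t)]\,\mathbb{E}[Y_1^2]+\big(\mathbb{E}[N^2(t)]-\mathbb{E}[N(t)]\big)\,\mathbb{E}[Y_1]^2,$$

and subtracting $\big(\mathbb{E}[N(t)]\,\mathbb{E}[Y_1]\big)^2$ again gives $\mathrm{Var}(X(t))=\mathbb{E}[N(t)]\,\mathrm{Var}(Y_1)+\mathbb{E}[Y_1]^2\,\mathrm{Var}(N(t))$.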

Second answer

Your calculation for $\mathbb{E}[X^2(t)]$ is not quite right – when you expand out the square, you would also get the cross terms, e.g. $\mathbb{E}[Y_i Y_j]$, $i \ne j$.

Analogous to the computation of $\mathbb{E}[X(t)]$, which uses the law of total expectation, there is a version for variances: $$ \mathrm{Var}(X(t)) = \mathbb{E}[\mathrm{Var}(X(t) \mid N(t))] + \mathrm{Var}(\mathbb{E}[X(t) \mid N(t)]) . $$ (Can you prove this?) If you use this result instead, you should be able to obtain $\mathrm{Var}(X(t))$ with arguments similar to the ones you have already used.
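For reference, here is a sketch of how those pieces combine, assuming (as is standard for a compound Poisson process) that the $Y_i$ are also independent of $N(t)$: conditional on $N(t)$, $X(t)$ is a sum of $N(t)$ i.i.d. terms, so $\mathbb{E}[X(t)\mid N(t)]=N(t)\,\mathbb{E}[Y_1]$ and $\mathrm{Var}(X(t)\mid N(t))=N(t)\,\mathrm{Var}(Y_1)$, giving

$$\mathrm{Var}(X(t))=\mathbb{E}\big[N(t)\,\mathrm{Var}(Y_1)\big]+\mathrm{Var}\big(N(t)\,\mathbb{E}[Y_1]\big)=\mathbb{E}[N(t)]\,\mathrm{Var}(Y_1)+\mathbb{E}[Y_1]^2\,\mathrm{Var}(N(t)).$$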