Random sum of normal random variables with a Poisson-distributed number of terms?


A random variable $M$ is Poisson distributed with $\lambda=2$. $X_1, X_2, \dots$ are independent and identically distributed normal random variables with $\mu=3$ and $\sigma=0.2$.

Introduce a new random variable, $Z$, which is defined as: $Z=\sum_{k=1}^{M}X_k$.

We are also given that $Z=0$ if $M=0$ and are asked to calculate the first and second moments of $Z$.


Since this is a discrete distribution, can I calculate $E[Z]$ as the sum of the expectations? So, $E[Z] = \sum_{k=1}^{M}X_k= k\mu = 3k$?



BEST ANSWER

Assuming the upper limit of the sum is indeed the random variable $M$, condition on $M$ and use the tower property: $$\operatorname{E}[Z] = \operatorname{E}[\operatorname{E}[Z \mid M]] = \operatorname{E}\left[\sum_{k=1}^M \operatorname{E}[X_k]\right] = \operatorname{E}[M \mu] = \mu \operatorname{E}[M] = \mu \lambda.$$

The second moment is more difficult, since $\operatorname{E}[Z^2]$ is messy to compute directly with this approach. Instead, it is easier to use the law of total variance: $$\operatorname{Var}[Z] = \operatorname{E}[\operatorname{Var}[Z \mid M]] + \operatorname{Var}[\operatorname{E}[Z \mid M]].$$ Conditional on $M$, the sum $Z$ is normal with mean $M\mu$ and variance $M \sigma^2$ (the $X_k$ are independent); thus the above reduces to $$\operatorname{Var}[Z] = \operatorname{E}[M \sigma^2] + \operatorname{Var}[M \mu] = \sigma^2 \operatorname{E}[M] + \mu^2 \operatorname{Var}[M].$$ Since $M$ is Poisson with mean and variance both equal to $\lambda$, it follows that $$\operatorname{Var}[Z] = \lambda(\mu^2 + \sigma^2).$$ To finish, we simply have $$\operatorname{E}[Z^2] = \operatorname{Var}[Z] + \operatorname{E}[Z]^2 = \lambda(\mu^2 + \sigma^2) + \lambda^2 \mu^2.$$
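With the given values $\lambda = 2$, $\mu = 3$, $\sigma = 0.2$, this gives $\operatorname{E}[Z] = \mu\lambda = 6$ and $\operatorname{E}[Z^2] = 2(9 + 0.04) + 4 \cdot 9 = 54.08$. As a sanity check, here is a small Monte Carlo sketch in Python (standard library only; the helper names are my own):

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's method: count uniforms whose running product stays above e^{-lam}
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_z(lam, mu, sigma, rng):
    # Z = X_1 + ... + X_M with M ~ Poisson(lam); the empty sum gives Z = 0
    m = sample_poisson(lam, rng)
    return sum(rng.gauss(mu, sigma) for _ in range(m))

def estimate_moments(lam=2.0, mu=3.0, sigma=0.2, n=200_000, seed=0):
    # Sample means of Z and Z^2 over n independent draws
    rng = random.Random(seed)
    zs = [sample_z(lam, mu, sigma, rng) for _ in range(n)]
    first = sum(zs) / n
    second = sum(z * z for z in zs) / n
    return first, second
```

Running `estimate_moments()` should land close to the theoretical values $6$ and $54.08$.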


In general, for any random variable $X$ from which IID observations $X_1, X_2, \ldots$ are drawn, and any nonnegative-integer-valued random variable $M \in \{0,1,2, \ldots\}$ independent of the $X_k$, we have $$\operatorname{Var}\left[\sum_{k=1}^M X_k \right] = \operatorname{Var}[X]\operatorname{E}[M] + \operatorname{E}[X]^2 \operatorname{Var}[M],$$ whenever these moments exist. This is an immediate consequence of the law of total variance and, as we have seen above, relies on no further distributional assumptions.
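The distribution-free nature of this identity is easy to check numerically with a pair far from the Poisson/normal case. A sketch under assumed choices (geometric $M$, exponential $X$; helper names are hypothetical):

```python
import math
import random

def compound_moments(sample_m, sample_x, n, rng):
    """Monte Carlo mean and variance of Z = X_1 + ... + X_M (Z = 0 when M = 0)."""
    zs = []
    for _ in range(n):
        m = sample_m(rng)
        zs.append(sum(sample_x(rng) for _ in range(m)))
    mean = sum(zs) / n
    var = sum((z - mean) ** 2 for z in zs) / n
    return mean, var

# Example: M geometric on {0, 1, 2, ...} with P(M = k) = (1 - p) p^k,
# and X exponential with rate 1.
p = 0.5

def sample_m(rng):
    # Inverse-transform sample of the geometric law above
    return int(math.log(1.0 - rng.random()) / math.log(p))

def sample_x(rng):
    return rng.expovariate(1.0)

# Theory: E[M] = p/(1-p) = 1, Var[M] = p/(1-p)^2 = 2, E[X] = Var[X] = 1,
# so E[Z] = E[X] E[M] = 1 and Var[Z] = Var[X] E[M] + E[X]^2 Var[M] = 1 + 2 = 3.
```

With a large sample, `compound_moments(sample_m, sample_x, ...)` should return values near $1$ and $3$, matching the formula even though neither $M$ nor $X$ is Poisson or normal.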