Find $E[Y]$ where $Y$ is a sum of $N$ i.i.d. Gamma random variables


Suppose $$Y=\sum_{i=1}^N X_i,$$ where $X_i$'s are i.i.d $\operatorname{Gamma}(\alpha,\beta)$ and $N\sim \operatorname{Poisson}(\mu)$. We also assume that $N$ is independent of $X_i$'s.

  1. Find $E[Y]$
  2. Find the moment generating function of $Y$
  3. Find the $\operatorname{Cov}(N + Y, 1 + Y)$

So far we have learned about moment generating functions and the multinomial distribution. However, I can't see a starting point for approaching this problem.

Here $N$ is a random variable; what does that imply? Also, why does it matter that $N$ is independent of the $X_i$'s?

I would appreciate it if anybody could give me some guidance on this question.


There are 3 best solutions below

---

(Big) Hint: rewrite the sum as $$ Y = \sum_{i=1}^\infty X_i \mathbf{1}_{N \geq i} $$ and then use linearity of expectation to get $$ \mathbb{E}[Y] = \sum_{i=1}^\infty \mathbb{E}[X_i \mathbf{1}_{N \geq i}] $$ Then, use the fact that $N$ is independent of the $X_i$'s.
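As a sanity check on where this hint leads ($\mathbb{E}[Y]=\mathbb{E}[X_1]\,\mathbb{E}[N]=\mu\alpha/\beta$), here is a quick Monte Carlo sketch; the parameter values are made up purely for illustration:

```python
import numpy as np

# Illustrative parameters (not from the problem): Gamma shape, Gamma rate, Poisson mean.
alpha, beta, mu = 2.0, 3.0, 4.0
rng = np.random.default_rng(0)

n_trials = 100_000
# Draw N ~ Poisson(mu), then Y = sum of N i.i.d. Gamma(alpha, rate=beta) draws.
N = rng.poisson(mu, size=n_trials)
# numpy's gamma sampler takes a *scale* parameter, so scale = 1/beta for rate beta.
Y = np.array([rng.gamma(alpha, 1.0 / beta, size=n).sum() for n in N])

print(Y.mean())            # Monte Carlo estimate of E[Y]
print(mu * alpha / beta)   # theoretical value mu * alpha / beta
```

The two printed values should agree to within Monte Carlo error.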

---

\begin{align} & \operatorname E \left( \sum_{i=1}^N X_i \right) \\[8pt] = {} & \operatorname E\left( \operatorname E\left( \sum_{i=1}^N X_i \mathbin{\Big\vert} N \right) \right) \\[8pt] = {} & \operatorname E\left( N\operatorname E(X_1) \right) \\[8pt] = {} & \operatorname E (N) \operatorname E(X_1) \text{ since $\operatorname E(X_1)$ is a constant.} \end{align} A similar technique can be used to find the m.g.f.

Using linearity in each argument separately, the problem on covariances reduces to finding $\operatorname{cov}(N,Y),$ and then you can use this: $$ \operatorname{cov}(A,B) = \operatorname E\big(\operatorname{cov}(A,B\mid N)\big) + \operatorname{cov}\big(\operatorname E(A\mid N), \operatorname E(B\mid N)\big). $$

Notice that the conditional covariance given $N$, of two random variables one of which is $N,$ is $0.$ So you're left with the second term, the covariance between the two conditional expected values.
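The conditioning argument can be verified numerically: since $\operatorname E(Y\mid N)=N\,\alpha/\beta$, the decomposition above gives $\operatorname{cov}(N,Y)=\operatorname{cov}(N,\,N\alpha/\beta)=(\alpha/\beta)\operatorname{Var}(N)=\mu\alpha/\beta$. A hedged simulation sketch, with parameters chosen only for illustration:

```python
import numpy as np

# Illustrative parameters (not from the problem).
alpha, beta, mu = 2.0, 3.0, 4.0
rng = np.random.default_rng(1)

n_trials = 100_000
N = rng.poisson(mu, size=n_trials)
# Y | N is a sum of N i.i.d. Gamma(alpha, rate=beta) variables (scale = 1/beta).
Y = np.array([rng.gamma(alpha, 1.0 / beta, size=n).sum() for n in N])

# cov(N, Y) should match cov(E(N|N), E(Y|N)) = (alpha/beta) * Var(N) = mu*alpha/beta.
cov_NY = np.cov(N, Y)[0, 1]
print(cov_NY, mu * alpha / beta)
```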

---

The obvious answer to part 1 is $EX_1\,E_NN=\frac{\alpha}{\beta}\mu$, where $E,\,E_N$ respectively denote expectations over $X_i,\,N$. Note next that $Ee^{tX_1}=(1-t/\beta)^{-\alpha}$ for $t<\beta$.

For part 2, conditioning on $N$, the MGF is $$E\left[e^{tY}\right]=E\left[\prod_{i=1}^Ne^{tX_i}\right]=E_N\left[\left(Ee^{tX_1}\right)^N\right]=\sum_{n\ge0}e^{-\mu}\frac{\left(\mu Ee^{tX_1}\right)^n}{n!}=e^{\mu\left(Ee^{tX_1}-1\right)}=e^{\mu((1-t/\beta)^{-\alpha}-1)}.$$

Revisiting part 1 as a sanity check, the mean is this function's first derivative at $t=0$, i.e. $$\left.\frac{\mu\alpha}{\beta}(1-t/\beta)^{-\alpha-1}e^{\mu((1-t/\beta)^{-\alpha}-1)}\right|_{t=0}=\frac{\mu\alpha}{\beta}.$$ A similar treatment of the second derivative gives $EY^2=\frac{\mu\alpha\left(\mu\alpha+\alpha+1\right)}{\beta^{2}}$.

For part 3, $$\begin{align}\operatorname{Cov}(N+Y,\,1+Y)&=E(N+Y+NY+Y^2)-E(N+Y)E(1+Y)\\&=\mu+\frac{\mu\alpha}{\beta}+E(NY)+\frac{\mu\alpha\left(\mu\alpha+\alpha+1\right)}{\beta^{2}}-\left(\mu+\frac{\mu\alpha}{\beta}\right)\left(1+\frac{\mu\alpha}{\beta}\right)\\&=E(NY)+\frac{\mu\alpha(\alpha-\mu\beta+1)}{\beta^2}.\end{align}$$

We need to be careful evaluating $E(NY)$: since $E(Y\mid N)=N\alpha/\beta$, it's $$E(NY)=\sum_{n\ge0}e^{-\mu}\frac{\mu^n}{n!}\,n^2\frac{\alpha}{\beta}=\frac{\alpha}{\beta}E(N^2)=\frac{\alpha\mu(\mu+1)}{\beta},$$ so $$\operatorname{Cov}(N+Y,\,1+Y)=\frac{\mu\alpha(\alpha+\beta+1)}{\beta^2}.$$ (You'll want to double-check all these calculations.)
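The closed forms above, the MGF $e^{\mu((1-t/\beta)^{-\alpha}-1)}$ and the covariance $\mu\alpha(\alpha+\beta+1)/\beta^2$, can be spot-checked by simulation. The parameter values below are arbitrary choices for the check, not part of the problem:

```python
import numpy as np

# Arbitrary illustrative parameters: Gamma shape, Gamma rate, Poisson mean.
alpha, beta, mu = 2.0, 3.0, 4.0
rng = np.random.default_rng(2)

n_trials = 200_000
N = rng.poisson(mu, size=n_trials)
# Sum of N i.i.d. Gamma(alpha, rate=beta) draws; numpy uses scale = 1/beta.
Y = np.array([rng.gamma(alpha, 1.0 / beta, size=n).sum() for n in N])

# MGF at a small t (we need t < beta for Ee^{tX_1} to be finite).
t = 0.5
mgf_mc = np.exp(t * Y).mean()
mgf_theory = np.exp(mu * ((1 - t / beta) ** (-alpha) - 1))

# Cov(N + Y, 1 + Y) should equal mu*alpha*(alpha+beta+1)/beta^2.
cov_mc = np.cov(N + Y, 1 + Y)[0, 1]
cov_theory = mu * alpha * (alpha + beta + 1) / beta**2

print(mgf_mc, mgf_theory)
print(cov_mc, cov_theory)
```

Each Monte Carlo estimate should land close to its theoretical counterpart, which is a reasonable check that no algebra slipped in the derivation.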