Find the variance of $S_N$ in terms of the means and variances of $X$ and $N$


I got this problem from Introduction to Probability by Anderson D, Seppalainen T, Valko B. It is challenge problem 10.55.

Let $X_1,X_2,...$ be i.i.d. random variables and $N$ an independent nonnegative integer valued random variable. Let $S_N=X_1+...+X_N$. Assume that the m.g.f. of $X_i$, denoted $M_X(t)$, and the m.g.f. of $N$, denoted $M_N(t)$ are finite in some interval $(-\delta,\delta)$ around the origin.

  1. Express the m.g.f. $M_{S_N}(t)$ of $S_N$ in terms of $M_X(t)$ and $M_N(t)$.
  2. Assume $N\sim Poisson(\lambda)$ and $X\sim Ber(p)$, deduce what is the distribution of $S_N$ from part 1.
  3. Compute the second moment $\mathbb{E}[S^2_N]$ in terms of the moments of $X$ and $N$ and deduce the variance of $S_N$ in terms of the means and variances of $X$ and $N$.

My Attempt

  1. We have the formula: if $X_1,\dots,X_n$ are independent, then $$M_{X_1+\dots+X_n}(t)=M_{X_1}(t)\cdots M_{X_n}(t).$$ So $M_{S_N}(t)=M_{X_1}(t)\cdots M_{X_N}(t)$, but we need it in terms of $M_X(t)$ and $M_N(t)$. This is the part I am having trouble with, since $N$ is a random variable, not a number.

A hint I was given is that I should decompose the computation of the expectation by conditioning w.r.t. $N$.

So if I try that, I have $$\mathbb{E}[e^{tS_N}|N=1]=\mathbb{E}[e^{tX_1}]$$ $$\mathbb{E}[e^{tS_N}|N=2]=\mathbb{E}[e^{tX_1}]\mathbb{E}[e^{tX_2}]$$ and so on... But how would I put this together to find just $\mathbb{E}[e^{tS_N}]$ and make it in terms of $M_X(t)$ and $M_N(t)$?

  3. For this part, isn't the second moment equal to the variance? From what I understand about moment generating functions, the first moment is the expectation and the second moment is the variance. Why does the problem ask for both?

1 Answer


So for the first part we can write \begin{align} M_{S_N}(t) = \mathbb{E}[e^{tS_{N}}] &= \mathbb{E}\left[e^{t\sum_{i=1}^{N}X_i}\right] \\ &= \mathbb{E}\left[\mathbb{E}\left[e^{t\sum_{i=1}^{N}X_i}\,\middle|\,N\right]\right] && \text{(law of iterated expectations)} \\ &= \mathbb{E}\left[\left(\mathbb{E}[e^{tX_1}]\right)^N\right] && \text{($X_i$ i.i.d. and independent of $N$)} \\ &= \mathbb{E}\left[e^{N\ln M_X(t)}\right] \\ &= M_N\left(\ln M_X(t)\right). \end{align}
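The identity $M_{S_N}(t) = M_N(\ln M_X(t))$ can be sanity-checked numerically. A minimal simulation sketch, using the part-2 setup $N\sim Poisson(\lambda)$, $X\sim Ber(p)$ with illustrative parameter values of my own choosing (in that case the closed form collapses to $e^{\lambda p(e^t-1)}$, the Poisson($\lambda p$) m.g.f.):

```python
import math
import random

random.seed(0)
lam, p, t = 3.0, 0.4, 0.5  # illustrative values, not from the book

# Closed form: M_X(t) = 1 - p + p*e^t and M_N(s) = exp(lam*(e^s - 1)),
# so M_N(ln M_X(t)) = exp(lam*p*(e^t - 1)).
closed_form = math.exp(lam * p * (math.exp(t) - 1.0))

def poisson(rate):
    """Draw a Poisson(rate) variate: count exponential arrivals in [0, 1]."""
    total, count = random.expovariate(rate), 0
    while total < 1.0:
        count += 1
        total += random.expovariate(rate)
    return count

# Monte Carlo estimate of E[e^{t S_N}] with S_N = X_1 + ... + X_N.
n_samples = 200_000
acc = 0.0
for _ in range(n_samples):
    n = poisson(lam)
    s = sum(1 for _ in range(n) if random.random() < p)  # Bernoulli(p) sum
    acc += math.exp(t * s)
estimate = acc / n_samples

print(closed_form, estimate)  # the two numbers should agree to within ~1%
```

With this sample size the Monte Carlo estimate matches the closed form to a few tenths of a percent, which is consistent with the derivation above.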

For the third part: no, the second moment is in general not equal to the variance. The $k$th moment is $\mathbb{E}[X^k]$ for $k = 1,2,\dots$, and $\mathrm{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$, so the variance equals the second moment only if $\mathbb{E}[X] = 0$.
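A concrete instance (with an illustrative $p$ of my own choosing): for $X\sim Ber(p)$ we have $X^2 = X$, so $\mathbb{E}[X^2] = p$, while $\mathrm{Var}(X) = p - p^2$; the two clearly differ since $\mathbb{E}[X] = p \neq 0$.

```python
p = 0.3                    # illustrative value
second_moment = p          # E[X^2] = p, since X^2 = X for a Bernoulli
variance = p - p ** 2      # E[X^2] - (E[X])^2 = p(1 - p)
print(second_moment, variance)  # 0.3 vs 0.21
```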

Now the $k$th moment of $X$ can be found by taking the $k$th derivative of the moment generating function of $X$ and evaluating it at $0$, i.e. $\mathbb{E}[X^k] = M_{X}^{(k)}(0)$, where $(k)$ denotes the $k$th derivative.
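As an illustrative sketch of this differentiation recipe (not the book's solution), one can apply it symbolically to the part-2 special case $N\sim Poisson(\lambda)$, $X\sim Ber(p)$, differentiating $M_N(\ln M_X(t))$ at $t=0$ to obtain $\mathbb{E}[S_N]$, $\mathbb{E}[S_N^2]$, and hence the variance. This sketch assumes the third-party `sympy` library:

```python
import sympy as sp

t, lam, p = sp.symbols('t lam p', positive=True)

# Special case from part 2: N ~ Poisson(lam), X ~ Ber(p).
M_X = 1 - p + p * sp.exp(t)                    # m.g.f. of Ber(p)
M_N = lambda s: sp.exp(lam * (sp.exp(s) - 1))  # m.g.f. of Poisson(lam)

# Part 1: M_{S_N}(t) = M_N(ln M_X(t))
M_S = sp.simplify(M_N(sp.log(M_X)))

# k-th moment = k-th derivative of the m.g.f. evaluated at t = 0
first = sp.simplify(sp.diff(M_S, t).subs(t, 0))      # E[S_N]
second = sp.simplify(sp.diff(M_S, t, 2).subs(t, 0))  # E[S_N^2]
var = sp.simplify(second - first**2)                 # Var(S_N)

print(first, second, var)
# E[S_N] = lam*p, E[S_N^2] = lam*p + (lam*p)^2, Var(S_N) = lam*p,
# consistent with S_N ~ Poisson(lam*p) from part 2.
```

Note that $\lambda p(1-p) + \lambda p^2 = \lambda p$, so this also agrees with the general identity $\mathrm{Var}(S_N) = \mathbb{E}[N]\mathrm{Var}(X) + \mathrm{Var}(N)(\mathbb{E}[X])^2$ that part 3 is asking for.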