Joint Moment Generating Function from Conditional and Marginal Distribution


Suppose that the random variable $N$ follows a Poisson distribution with mean $\lambda=6$, and that the conditional distribution of the random variable $X$, given $N=n$, is $\mathrm{Binomial}(n,0.6)$.

Find the Joint Moment Generating Function of $(N, X)$.

Initially I just tried to use the definition. I found the joint PMF from the conditional and marginal distributions, but then I have to sum over both variables to find the joint MGF, and that is the step where I got stuck: the double sum over the product of the binomial and Poisson PMFs doesn't simplify nicely.

Since this problem does NOT assume independence, I can't use that to my advantage either, so now I'm stuck. How can I calculate this joint MGF?

There are 2 solutions below.

Solution 1:

The summations are doable. The joint pmf is $$f_{X, N}(k, n) = \operatorname P(X = k \mid N = n) \operatorname P(N = n) = \binom n k p^k (1 - p)^{n - k} e^{-\lambda} \frac {\lambda^n} {n!}.$$ Sum over $k$ first: $$\sum_{k = 0}^n e^{s k + t n} f_{X, N}(k, n) = e^{-\lambda} \frac {\lambda^n} {n!} e^{t n} \sum_{k = 0}^n \binom n k (p e^s)^k (1 - p)^{n - k} = \\ e^{-\lambda} \frac {\lambda^n} {n!} e^{t n} (p e^s + 1 - p)^n.$$ Then sum over $n$: $$\operatorname E\left[ e^{s X + t N} \right] = \sum_{n=0}^\infty \sum_{k = 0}^n e^{s k + t n} f_{X, N}(k, n) = \\ e^{-\lambda} \sum_{n=0}^\infty \frac {(\lambda e^t (p e^s + 1 - p))^n} {n!} = \\ \exp( -\lambda + \lambda e^t (p e^s + 1 - p)).$$
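If you want to sanity-check the closed form $\exp(-\lambda + \lambda e^t (p e^s + 1 - p))$ numerically, a short Monte Carlo estimate of $\operatorname E[e^{sX+tN}]$ agrees with it. (This check is my addition, not part of the original answer; it assumes `numpy` is available.)

```python
import numpy as np

lam, p = 6.0, 0.6
s, t = 0.1, 0.2  # arbitrary test point

# Closed form from the derivation: E[e^{sX + tN}] = exp(-lam + lam*e^t*(p*e^s + 1 - p))
closed = np.exp(-lam + lam * np.exp(t) * (p * np.exp(s) + 1 - p))

# Simulate the hierarchical model: N ~ Poisson(6), then X | N=n ~ Binomial(n, 0.6)
rng = np.random.default_rng(42)
n = rng.poisson(lam, size=500_000)
x = rng.binomial(n, p)

# Monte Carlo estimate of the joint MGF at (s, t)
mc = np.exp(s * x + t * n).mean()
print(closed, mc)  # the two values should be close
```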

Solution 2:
  • Firstly: $N\sim\mathcal{Poisson}(6)$ so $\mathsf M_N(u)=\mathrm e^{6(\mathrm e^u-1)}$.
    • You may find this by definition $\mathsf M_N(u)=\mathsf E(\mathrm e^{uN})=\sum_{n=0}^\infty \tfrac{(6\mathrm e^{u})^n\mathrm e^{-6}}{n!}$ and recall $\mathrm e^x:=\sum_{n=0}^\infty (x^n/n!)$
  • Secondly: $X\mid N\sim\mathcal{Binomial}(N, 0.6)$ so $\mathsf M_{X\mid N}(v)=(0.4+0.6\mathrm e^v)^{N}$
    • Likewise $\mathsf M_{X\mid N}(v)=\sum_{k=0}^N\binom{N}{k}(0.6\mathrm e^v)^k(0.4)^{N-k}$, and recall the Binomial Expansion.
  • Finally: put this together: $$\begin{align}\mathsf M_{N,X}(s,t)&=\mathsf E(\mathrm e^{sN+tX})&&\text{by definition}\\&=\mathsf E(\mathrm e^{sN}\mathsf E(\mathrm e^{tX}\mid N))&&\text{by tower rule}\\&=\mathsf E(\mathrm e^{sN}\mathsf M_{X\mid N}(t))&&\text{by definition}\\ &= \mathsf E(\mathrm e^{sN}(1-0.6+0.6\mathrm e^t)^N)&&\text{as per above}\\ &=\mathsf E(\mathrm e^{(s+\ln(1-0.6+0.6\mathrm e^t))N})&&\text{by algebra}\\&=\mathsf M_N(s+\ln(1-0.6+0.6\mathrm e^t))&&\text{by definition}\\&=\mathrm e^{6(\mathrm e^{s}(1-0.6+0.6\mathrm e^t)-1)}&&\text{as per above}\end{align}$$
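As a quick sanity check (my addition, not part of the original answer): partial derivatives of $\mathsf M_{N,X}(s,t)$ at $(0,0)$ should recover the means $\mathsf E(N)=\lambda=6$ and, by the tower rule, $\mathsf E(X)=\mathsf E(0.6N)=3.6$. A central-difference approximation using only the standard library confirms this:

```python
import math

lam, p = 6.0, 0.6

def M(s, t):
    # Joint MGF from the answer: M_{N,X}(s,t) = exp(lam*(e^s*(1-p+p*e^t) - 1))
    return math.exp(lam * (math.exp(s) * (1 - p + p * math.exp(t)) - 1.0))

# Central differences approximate the first partial derivatives at (0, 0)
h = 1e-6
E_N = (M(h, 0) - M(-h, 0)) / (2 * h)  # dM/ds at (0,0) = E[N], expect 6
E_X = (M(0, h) - M(0, -h)) / (2 * h)  # dM/dt at (0,0) = E[X], expect 3.6
print(E_N, E_X)
```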