It is known that $X$ has a $b(Y,p)$ (binomial) distribution, where $Y$ is a random variable which has a $P(\lambda)$ (Poisson) distribution. Find the moment generating function of $X$. --I know that the MGF of a random variable is $E[e^{tX}]$, but the second random variable inside is throwing me off.
Mathematical Statistics Question

193 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-28 02:41:06
There are 2 best solutions below
The random variable $X$ may be viewed as $$X = \sum_{k=1}^Y B_k$$ where the $B_k$ are iid Bernoulli$(p)$ random variables independent of $Y$. Then the PGF of $X$ is given by $$P_X(s)=G_Y(H_B(s))$$ where $G_Y(s)=\exp(\lambda(s-1))$ is the PGF of the Poisson distribution and $H_B(s)=(1-p)+ps$ is the PGF of the Bernoulli distribution. \begin{eqnarray*} P_X(s)&=&\exp(\lambda((1-p)+ps-1))\\ &=&\exp(\lambda p(s-1)) \end{eqnarray*} which is the PGF of the Poisson distribution with parameter $\lambda p$. The MGF asked for in the question then follows by substituting $s=e^t$: $$M_X(t)=P_X(e^t)=\exp(\lambda p(e^t-1)).$$
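A quick Monte Carlo sanity check of this construction (a sketch; the parameter values $\lambda=4$, $p=0.3$ and the sample size are illustrative choices, not part of the question): simulate $X=\sum_{k=1}^Y B_k$ and compare the empirical mean and variance with $\lambda p$, since a Poisson$(\lambda p)$ law has both equal to $\lambda p$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, n = 4.0, 0.3, 200_000  # illustrative parameters, not from the question

# Thinning construction: Y ~ Poisson(lam), then X | Y ~ Binomial(Y, p),
# i.e. X is a sum of Y iid Bernoulli(p) indicators.
Y = rng.poisson(lam, size=n)
X = rng.binomial(Y, p)

# Poisson(lam * p) has mean and variance both equal to lam * p = 1.2
print(X.mean(), X.var())
```

Both printed values should sit close to $\lambda p = 1.2$, consistent with $X\sim\operatorname{Poisson}(\lambda p)$.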
$\newcommand{\E}{\operatorname{E}}$This is a standard exercise. It ought to be phrased as $X\mid Y \sim \operatorname{Binomial}(Y,p),$ i.e. the conditional distribution of $X$ given $Y$ is $\operatorname{Binomial}(Y,p),$ rather than saying that that is simply the distribution of $X$. The distribution of $X$, i.e. the marginal (i.e. "unconditional") distribution of $X$, is supported on the infinite set $\{0,1,2,3,\ldots\},$ as a moment's thought will tell you. \begin{align} \Pr(X=x) = \E(\Pr(X=x\mid Y)) {} & = \sum_{y = 0}^\infty \Pr(X=x\mid Y=y) \Pr(Y=y) \\[10pt] = {} & \sum_{y=x} ^\infty \Pr(X=x\mid Y=y) \Pr(Y=y) \tag a \\[6pt] & \quad (\text{This starts at } y=x \text{ rather than at } y=0 \\ & \quad \phantom{(} \text{because } X \text{ cannot be equal to } x \text{ unless } y\ge x.) \end{align}
You know that $\displaystyle\Pr(Y=y) = \frac{e^{-\lambda} \lambda^y}{y!}.$
And that $\displaystyle \Pr(X=x \mid Y=y) = \binom y x p^x (1-p)^{y-x}.$
Plugging those into line $(\textbf{a})$ above, we get $$ \sum_{y=x}^\infty \binom y x p^x (1-p)^{y-x} \cdot \frac{e^{-\lambda} \lambda^y}{y!}. $$ If we let $w=y-x$, then as $y$ goes from $x$ to $\infty$, $w$ goes from $0$ to $\infty$, and where we see $y-x$ we put $w$, and where we see $y$ alone we put $w+x$, getting $$ \sum_{w=0}^\infty \binom {w+x} x p^x (1-p)^w \cdot \frac{e^{-\lambda} \lambda^{w+x}}{(w+x)!}. $$ This is \begin{align} \sum_{w=0}^\infty \frac{(w+x)!}{w!x!} p^x (1-p)^w \cdot \frac{e^{-\lambda} \lambda^{w+x}}{(w+x)!} = \sum_{w=0}^\infty \left( \frac{(p\lambda)^x}{x!} e^{-p\lambda} \right) \left(\frac{((1-p)\lambda)^w}{w!} e^{-(1-p)\lambda} \right). \end{align} The factor $\displaystyle \left( \frac{(p\lambda)^x}{x!} e^{-p\lambda} \right)$ does not change as $w$ goes from $0$ to $\infty$, so it can be pulled out, getting $$ \left( \frac{(p\lambda)^x}{x!} e^{-p\lambda} \right) \sum_{w=0}^\infty \left(\frac{((1-p)\lambda)^w}{w!} e^{-(1-p)\lambda} \right). $$ The sum is clearly equal to $1$ (since it's the sum of all the probabilities from the $\operatorname{Poisson}((1-p)\lambda)$ distribution). So finally we have $$ \Pr(X=x) = \frac{(p\lambda)^x}{x!} e^{-p\lambda}, $$ or in other words $$ X\sim\operatorname{Poisson}(p\lambda). $$
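To double-check this computation, one can evaluate the sum in line $(\textbf{a})$ numerically (truncated, since the Poisson tail decays fast) and compare it with the Poisson$(p\lambda)$ pmf. A sketch, with $\lambda=4$ and $p=0.3$ as illustrative values not taken from the question:

```python
from math import comb, exp, factorial

lam, p = 4.0, 0.3  # illustrative parameters, not from the question

def pmf_marginal(x, terms=150):
    # Pr(X = x) = sum_{y >= x} C(y, x) p^x (1-p)^(y-x) * e^{-lam} lam^y / y!
    return sum(comb(y, x) * p**x * (1 - p)**(y - x)
               * exp(-lam) * lam**y / factorial(y)
               for y in range(x, x + terms))

def pmf_poisson(x, mu):
    # Poisson(mu) pmf: e^{-mu} mu^x / x!
    return exp(-mu) * mu**x / factorial(x)

for x in range(5):
    print(x, pmf_marginal(x), pmf_poisson(x, p * lam))
```

The two columns agree to floating-point precision, matching the conclusion $X\sim\operatorname{Poisson}(p\lambda)$.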
PS: The question has been revised. What I did above should do it if you know the mgf of the Poisson distribution, but here's a way that is more in the spirit of the way the question is phrased: \begin{align} \E(e^{tX}) & = \E(\E(e^{tX}\mid Y)) = \E( (1 - p + pe^t)^Y) \\[10pt] & = \sum_{y=0}^\infty (1-p+pe^t)^y \frac{e^{-\lambda} \lambda^y}{y!} = e^{-\lambda} \sum_{y=0}^\infty \frac{ \left( (1-p+pe^t) \lambda \right)^y }{y!} \\[10pt] & = e^{-\lambda} e^{(1-p+pe^t) \lambda }. \end{align} Use a bit of algebra and see if you can get that to look like the mgf of the Poisson distribution with expected value $p\lambda$.
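The series step above can also be checked numerically: sum $e^{-\lambda}\sum_y \left((1-p+pe^t)\lambda\right)^y/y!$ term by term and compare it against $\exp(p\lambda(e^t-1))$, the mgf of the Poisson$(p\lambda)$ distribution. A sketch with illustrative values $\lambda=4$, $p=0.3$ (not from the question):

```python
from math import exp, factorial

lam, p = 4.0, 0.3  # illustrative parameters, not from the question

def mgf_series(t, terms=150):
    # E e^{tX} = e^{-lam} * sum_y ((1 - p + p e^t) lam)^y / y!, truncated
    z = (1 - p + p * exp(t)) * lam
    return exp(-lam) * sum(z**y / factorial(y) for y in range(terms))

def mgf_poisson(t, mu):
    # mgf of Poisson(mu): exp(mu * (e^t - 1))
    return exp(mu * (exp(t) - 1))

for t in (-1.0, 0.0, 0.5, 1.0):
    print(t, mgf_series(t), mgf_poisson(t, p * lam))
```

The two columns coincide, confirming that the mgf of $X$ is that of a Poisson distribution with mean $p\lambda$.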