Moment-generating functions of random variables


Textbook question:

Find the MGF of the random variable $X$ with PMF $$p_{X}(x) = pe^{-\lambda}\dfrac{\lambda^{x}}{x!}+(1-p)e^{-\mu}\dfrac{\mu^{x}}{x!}\text{, }\quad x = 0, 1, \dots$$ where $\lambda$ and $\mu$ are positive scalars, and $p$ satisfies $0 \leq p \leq 1$.

Hi guys, I'm new to MGFs and I need help with this question. I know that we're dealing with Poisson distributions, and we also have $p$ and its complement $(1-p)$, but I don't know how to put this information together.

I do know that the MGF of a Poisson distribution is given by $M_X(t) = e^{\lambda(e^{t}-1)}$.

But I don't know what to do when the distribution has a scalar coefficient.

Would it simply be something like $M_X(t) = p\,e^{\lambda(e^{t}-1)} + (1-p)\,e^{\mu(e^{t}-1)}$?

There are 3 answers below.

Accepted answer:

By definition, $$\begin{align} M_{X}(t) = \mathbb{E}\left[e^{tX}\right] &= \sum\limits_{x=0}^{\infty}e^{tx}p_{X}(x)\\ &= \sum\limits_{x=0}^{\infty}e^{tx}\left[pe^{-\lambda}\dfrac{\lambda^{x}}{x!}+(1-p)e^{-\mu}\dfrac{\mu^{x}}{x!}\right] \\ &= \sum\limits_{x=0}^{\infty}e^{tx}pe^{-\lambda}\dfrac{\lambda^{x}}{x!} + \sum\limits_{x=0}^{\infty}e^{tx}(1-p)e^{-\mu}\dfrac{\mu^{x}}{x!} \\ &= p \underbrace{\sum\limits_{x=0}^{\infty}e^{tx}e^{-\lambda}\dfrac{\lambda^{x}}{x!}}_{A} + (1-p)\underbrace{\sum\limits_{x=0}^{\infty}e^{tx}e^{-\mu}\dfrac{\mu^{x}}{x!}}_{B}\text{.} \end{align}$$ Notice that $A$ is the MGF of a Poisson with mean $\lambda$ and $B$ is the MGF of a Poisson with mean $\mu$. So $$M_{X}(t) = pe^{\lambda(e^{t}-1)}+(1-p)e^{\mu(e^{t}-1)}\text{.}$$

Another answer:

The moment generating function of $X$ is $E\left(e^{tX}\right)$. So we can quickly get an expression for the moment generating function of $X$. It is given by $$M_X(t) =\sum_{x=0}^\infty e^{tx}\left(pe^{-\lambda} \frac{\lambda^x}{x!}+(1-p)e^{-\mu} \frac{\mu^x}{x!}\right).$$ We recognize this sum as $$pM_U(t)+(1-p)M_V(t),$$ where $U$ is Poisson with parameter $\lambda$ and $V$ is Poisson with parameter $\mu$. But you know $M_U(t)$ and $M_V(t)$.

A third answer:

You can think of the situation this way. You have two independent Poisson random variables $Y_1$ and $Y_2$, with parameters $\lambda$ and $\mu$. You have a Bernoulli random variable $Z$ with parameter $p$, independent of $Y_1$ and $Y_2$. Then $X = Z Y_1 + (1-Z) Y_2$. $Z$ corresponds to a toss of a biased coin with probability $p$ of "heads". You toss your coin, and if it comes up "heads" you take $Y_1$ as the value of your random variable $X$, otherwise you take $Y_2$.

Now you want $M_X(t) = \mathbb E[e^{tX}]$. Condition on the value of $Z$. Given $Z = 1$, $X = Y_1$, so $$\mathbb E[e^{tX} \mid Z = 1] = \mathbb E[e^{tY_1}] = M_{Y_1}(t) = e^{\lambda (e^t-1)}$$ Similarly, $$ \mathbb E[e^{tX} \mid Z = 0] = \mathbb E[e^{tY_2}] = M_{Y_2}(t) = e^{\mu (e^t-1)}$$

Therefore $$ M_X(t) = \mathbb P(Z=1) \mathbb E[e^{tX} \mid Z = 1] + \mathbb P(Z=0) \mathbb E[e^{tX} \mid Z = 0] = p e^{\lambda(e^t-1)} + (1-p) e^{\mu(e^t-1)}$$
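The mixture representation $X = Z Y_1 + (1-Z) Y_2$ also lends itself to a Monte Carlo check: simulate the coin toss and the two Poissons, average $e^{tX}$, and compare with the closed form. A minimal sketch with illustrative parameters (the sampler and the values of $p$, $\lambda$, $\mu$, $t$ are my own choices):

```python
import math
import random

random.seed(0)
p, lam, mu, t = 0.3, 2.0, 5.0, 0.2  # illustrative parameters

def poisson(mean):
    """Poisson sampler by CDF inversion (dependency-free; fine for modest means)."""
    x = 0
    pmf = math.exp(-mean)
    cdf = pmf
    u = random.random()
    while u > cdf:
        x += 1
        pmf *= mean / x
        cdf += pmf
    return x

n = 200_000
acc = 0.0
for _ in range(n):
    z = 1 if random.random() < p else 0        # Z ~ Bernoulli(p), the biased coin
    x = poisson(lam) if z == 1 else poisson(mu)  # X = Z*Y1 + (1-Z)*Y2
    acc += math.exp(t * x)

estimate = acc / n
closed = (p * math.exp(lam * (math.exp(t) - 1))
          + (1 - p) * math.exp(mu * (math.exp(t) - 1)))
print(estimate, closed)  # Monte Carlo estimate is close to the closed form
```

With $2\times 10^5$ samples the estimate typically lands within a few percent of $p\,e^{\lambda(e^t-1)} + (1-p)\,e^{\mu(e^t-1)}$.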