Moment Generating Function of a combination of 2 RVs


The number $N$ of cars sold has a Poisson distribution with parameter $m$. Let $T=X_1 + X_2 +\cdots+ X_N$, where $X_i$ represents the size of the $i$-th claim and has a Gamma$(\alpha,\beta)$ distribution. The $X_i$ are independent and identically distributed, and independent of $N$.

Derive the MGF of $T$.

The $\operatorname{MGF}_T(e^{tT})$ is simply $\operatorname{MGF}_{X_1 + X_2 +\cdots+ X_N}(e^{t(X_1 + X_2 +\cdots+ X_N)})$,

where the product rule for MGFs of independent variables can be used. But there is a random component $N$ here. How does this affect the derivation?


If $S = X_1 + \ldots + X_n$ where $X_1, \ldots, X_n$ are independent, then $MGF_S(t) = \prod_{i=1}^n MGF_{X_i} (t)$.

This follows from $E[f(X)g(Y)] = E[f(X)]E[g(Y)]$ for $X,Y$ independent and $f,g$ arbitrary functions and $e^{a+b} = e^a e^b$.
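As a quick numerical sanity check of this product rule, the sketch below compares a Monte Carlo estimate of $E[e^{t(X+Y)}]$ against $E[e^{tX}]\,E[e^{tY}]$ for two independent variables. The distributions, sample size, and evaluation point $t$ are arbitrary choices made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
t = 0.2  # arbitrary point at which to evaluate the MGFs

# Two independent RVs; any distributions with a finite MGF at t would do.
X = rng.exponential(scale=1.0, size=500_000)
Y = rng.gamma(shape=2.0, scale=1.0, size=500_000)

lhs = np.exp(t * (X + Y)).mean()                    # Monte Carlo MGF of the sum
rhs = np.exp(t * X).mean() * np.exp(t * Y).mean()   # product of individual MGFs

print(lhs, rhs)  # agree up to Monte Carlo error
```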

For $S=X_1 + \ldots + X_N$ where $N$ is random, note that $E[e^{St}\mid N=n] = \prod_{i=1}^n MGF_{X_i} (t)$, with the empty product equal to $1$ for $n=0$ (since then $S=0$ and $e^{St}=1$). Hence $MGF_S(t) = \sum_{n=0}^\infty E[e^{St}\mid N=n] P(N=n)= \sum_{n=0}^\infty \prod_{i=1}^n MGF_{X_i} (t) P(N=n)$. When the $X_i$ are i.i.d., this is simply $\sum_{n=0}^\infty (MGF_{X_1}(t))^n P(N=n)$.
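This conditioning argument can also be checked numerically. The sketch below assumes $X_i \sim$ Gamma$(\alpha,\beta)$ with scale parameter $\beta$ (so $MGF_{X_1}(t) = (1-\beta t)^{-\alpha}$ for $t<1/\beta$, the standard Gamma MGF) and $N\sim\mathrm{Poisson}(m)$; all parameter values are made up for illustration:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)

# Assumed parameters for the sketch
m, alpha, beta = 2.0, 3.0, 0.5
t = 0.3  # needs t < 1/beta so the Gamma MGF exists

# Monte Carlo estimate of MGF_S(t) = E[e^{tS}] with S = X_1 + ... + X_N
n_sims = 200_000
N = rng.poisson(m, size=n_sims)
S = np.zeros(n_sims)
pos = N > 0
# A sum of n i.i.d. Gamma(alpha, scale beta) variables is Gamma(n*alpha, scale beta)
S[pos] = rng.gamma(shape=alpha * N[pos], scale=beta)
mc = np.exp(t * S).mean()

# Truncated series sum_n (MGF_X(t))^n P(N=n), including the n = 0 term
mgf_x = (1 - beta * t) ** (-alpha)
series = sum(mgf_x ** n * exp(-m) * m ** n / factorial(n) for n in range(60))

print(mc, series)  # agree up to Monte Carlo error
```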


$\newcommand{\MGF}{\operatorname{MGF}}\newcommand{\E}{\operatorname{E}}$

You wrote $\MGF_T(e^{tT})$ where you should have had $\MGF_T(t)=\E(e^{tT})$.

Let us assume you meant $\beta$ was the scale parameter rather than the rate parameter, so the Gamma density is $$ \frac 1 {\Gamma(\alpha)} \left( \frac x \beta \right)^{\alpha-1} e^{-x/\beta} \left( \frac{dx} \beta \right)\quad\text{for }x>0. $$ Then, for $t<1/\beta$, \begin{align} & \MGF_X(t) = \E(e^{tX}) \\[6pt] = {} & \int_0^\infty e^{tx} \frac 1 {\Gamma(\alpha)} \left( \frac x \beta \right)^{\alpha-1} e^{-x/\beta} \left( \frac{dx} \beta \right) \\[6pt] = {} & \int_0^\infty \frac 1 {\Gamma(\alpha)} \left( \frac x \beta \right)^{\alpha-1} e^{-x(1/\beta - t)} \left( \frac{dx} \beta \right) \\[6pt] = {} & \frac 1 {\beta^\alpha\Gamma(\alpha)} \cdot \left( \frac 1 \beta - t \right)^{-\alpha} \int_0^\infty \left(\left( \frac 1 \beta - t \right)x\right)^{\alpha-1} e^{-x(1/\beta - t)} \left(\left( \frac 1 \beta - t\right)\,dx\right) \\[6pt] = {} & \frac 1 {\beta^\alpha\Gamma(\alpha)} \cdot \left( \frac 1 \beta - t \right)^{-\alpha} \int_0^\infty u^{\alpha-1} e^{-u}\,du \\[6pt] = {} & \frac 1 {\beta^\alpha\Gamma(\alpha)} \cdot \left( \frac 1 \beta - t \right)^{-\alpha} \cdot \Gamma(\alpha) \\[6pt] = {} & (1-\beta t)^{-\alpha}. \end{align} Then for fixed (non-random) $n$ we have $$ \MGF_{X_1+\cdots+X_n}(t) = \E(e^{t(X_1+\cdots+X_n)}) = \E(e^{tX_1})\cdots \E(e^{tX_n}) = (1-\beta t)^{-n\alpha}. $$ Then for $N\sim\mathrm{Poisson}(m)$ we have \begin{align} & \MGF_{X_1+\cdots+X_N} (t) = \E(\E(e^{t(X_1+\cdots+X_N)}\mid N)) = \E((1-\beta t)^{-N\alpha}) \\[6pt] = {} & \sum_{n=0}^\infty (1-\beta t)^{-n\alpha} \Pr(N=n) = \sum_{n=0}^\infty (1-\beta t)^{-n\alpha} \frac{m^n e^{-m}}{n!} \\[6pt] = {} & e^{-m}\sum_{n=0}^\infty \left( (1-\beta t)^{-\alpha} m \right)^n \frac1{n!}, \end{align} and you should recall that that last sum has a neat closed-form expression.
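As a cross-check on the Gamma MGF, the sketch below integrates $e^{tx}$ against the scale-parameterized Gamma density numerically (midpoint rule, truncated far into the tail) and compares it with the standard closed form $(1-\beta t)^{-\alpha}$. The parameter values are arbitrary choices for illustration:

```python
import numpy as np
from math import gamma as Gamma

# Arbitrary parameters for the sketch; beta is the scale parameter, and we need t < 1/beta
alpha, beta, t = 3.0, 0.5, 0.4

# Gamma(alpha, scale beta) density
def pdf(x):
    return (x / beta) ** (alpha - 1) * np.exp(-x / beta) / (Gamma(alpha) * beta)

# Midpoint-rule integral of e^{tx} f(x) over (0, infinity), truncated at x = 200
dx = 1e-4
x = np.arange(dx / 2, 200.0, dx)
numeric = np.sum(np.exp(t * x) * pdf(x)) * dx

closed_form = (1 - beta * t) ** (-alpha)  # standard MGF of Gamma(alpha, scale beta)
print(numeric, closed_form)
```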