Expectation and Variance using Moment Generating Functions


Use moment generating functions to verify the following:
The expected value of the sum of independent random variables is the sum of the expected values.

Initially I thought to use the property that if $X$ and $Y$ are independent, then $M_{X+Y}(s)=M_X(s)M_Y(s)$

This property states that the MGF of a sum of independent random variables is the product of their individual MGFs.

By the definition of the MGF, $$M_{X+Y}(s)=\int _{-\infty}^{\infty}\int _{-\infty}^{\infty}e^{s(x+y)}f_{X,Y}(x,y)\,dx\,dy$$ $$=\int _{-\infty}^{\infty}\int _{-\infty}^{\infty}e^{sx+sy}f_{X,Y}(x,y)\,dx\,dy.$$ Since the two r.v.s are independent, their joint density factors into the product of their marginals, i.e. $$=\int _{-\infty}^{\infty}\int _{-\infty}^{\infty}e^{sx+sy}f_X(x)f_Y(y)\,dx\,dy.$$ Now we can differentiate with respect to $s$ and evaluate at $s=0$ to obtain the first moment: $$E(X+Y)=\left.\frac{d}{ds}\right|_{s=0}\int _{-\infty}^{\infty}\int _{-\infty}^{\infty}e^{sx+sy}f_X(x)f_Y(y)\,dx\,dy$$ $$=\int _{-\infty}^{\infty}\int _{-\infty}^{\infty}(x+y)f_X(x)f_Y(y)\,dx\,dy$$

The double integral splits because $\int_{-\infty}^{\infty}f_X(x)\,dx=\int_{-\infty}^{\infty}f_Y(y)\,dy=1$: $$=\int _{-\infty}^{\infty}xf_X(x)\,dx+ \int _{-\infty}^{\infty}yf_Y(y)\,dy$$ $$=E(X)+E(Y)$$

I suppose if this is correct, I can do the same on the RHS of the equation.
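As a numerical sanity check on the derivation above (not part of the original question), here is a short sketch using two independent exponential random variables, whose MGFs are known in closed form; the rates 2 and 3 are arbitrary illustrative choices.

```python
import math
import random

# Closed-form MGFs: for X ~ Exp(2), M_X(s) = 2/(2 - s) with E(X) = 1/2;
# for Y ~ Exp(3), M_Y(s) = 3/(3 - s) with E(Y) = 1/3.
def m_x(s): return 2.0 / (2.0 - s)
def m_y(s): return 3.0 / (3.0 - s)

def m_sum(s):
    # M_{X+Y}(s) = M_X(s) * M_Y(s) by independence
    return m_x(s) * m_y(s)

# Central-difference derivative of M_{X+Y} at s = 0 should give E(X) + E(Y) = 5/6.
h = 1e-6
deriv = (m_sum(h) - m_sum(-h)) / (2 * h)
print(deriv)  # close to 5/6 = 0.8333...

# Monte Carlo estimate of E[e^{s(X+Y)}], confirming the product formula itself.
random.seed(0)
s = 0.5
n = 200_000
est = sum(math.exp(s * (random.expovariate(2) + random.expovariate(3)))
          for _ in range(n)) / n
print(est, m_sum(s))  # the two numbers should be close
```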


There are 2 answers below.


Hint: In addition to using the property (due to independence of $X$ and $Y$)

$$M_{X+Y}(s) = M_{X}(s) M_{Y}(s) $$

which you point out, also use the property of MGFs (the reason behind the name moment "generating" function) that successive derivatives evaluated at $s=0$ give the successive moments. In particular, take the first derivative of both sides of the expression above (the right side needs the product rule), evaluate at $s=0$, and match terms.


EDIT (May 2nd): It's perhaps best to work out the simple details rather than give a convoluted answer. This problem is a nice, simple illustration of the properties of moment generating functions, and the beauty shouldn't get lost by going back to the definitions, sums, integrals, etc. So, here we go:

The derivative of the left side evaluated at $s=0$, namely $M'_{X+Y}(0)$, is $E(X+Y)$. Taking the derivative of the right side using the product rule and evaluating at $s=0$, we get:

$$M'_{X}(0) \times M_{Y}(0) \, + \, M_{X}(0) \times M'_{Y}(0) = EX \times 1 + 1 \times EY = EX + EY$$

since $M_{X}(0)=1$ and $M'_{X}(0)=EX$ for any moment generating function.

QED.
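The product-rule argument can be sketched numerically. Below, two normal MGFs are used as illustrative choices (they are not part of the answer above): $X\sim N(1,1)$ has $M_X(s)=e^{s+s^2/2}$ and $Y\sim N(2,1)$ has $M_Y(s)=e^{2s+s^2/2}$, so $E(X)+E(Y)=3$.

```python
import math

# Illustrative MGFs: X ~ N(1, 1) and Y ~ N(2, 1).
def m_x(s): return math.exp(1.0 * s + 0.5 * s * s)   # E(X) = 1
def m_y(s): return math.exp(2.0 * s + 0.5 * s * s)   # E(Y) = 2

def deriv(f, s, h=1e-6):
    """Central-difference approximation of f'(s)."""
    return (f(s + h) - f(s - h)) / (2 * h)

# Both sides of the product rule at s = 0:
#   M'_{X+Y}(0)  vs.  M'_X(0) * M_Y(0) + M_X(0) * M'_Y(0)
lhs = deriv(lambda s: m_x(s) * m_y(s), 0.0)
rhs = deriv(m_x, 0.0) * m_y(0.0) + m_x(0.0) * deriv(m_y, 0.0)
print(lhs, rhs)  # both close to E(X) + E(Y) = 3
```

Note that $M_X(0)=M_Y(0)=1$, which is why the cross terms collapse to $E(X)+E(Y)$.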


Given that the random variables $X$ and $Y$ are independent, by the definition of the MGF we have \begin{eqnarray*} M_{X+Y}(t)&=&E\left(e^{t(X+Y)}\right)=E(e^{tX})\cdot E(e^{tY})\qquad\text{(by independence)}\\ &=&E\left[ 1 + tX + \dfrac{t^2 X^2}{2!} + \cdots\right]\cdot E\left[ 1 + tY + \dfrac{t^2 Y^2}{2!} + \cdots\right]\\ &=&\left[ 1 + tE(X) + \dfrac{t^2}{2!}E(X^2) + \cdots\right]\cdot\left[ 1 + tE(Y) + \dfrac{t^2}{2!}E(Y^2) + \cdots\right]\\ &=& 1 + t\left(E(X)+E(Y)\right) + O(t^2). \end{eqnarray*} Differentiating and evaluating at $t=0$ picks out the coefficient of $t$: $$M'_{X+Y}(0)= E(X+Y) = E(X)+E(Y).$$
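The series argument can be checked concretely by multiplying truncated moment series as polynomials. The sketch below assumes exponential distributions, for which $E[X^k]=k!/\lambda^k$, so the coefficient of $t^k$ in the MGF series is simply $1/\lambda^k$; the rates 2 and 3 are illustrative choices.

```python
# Truncated MGF power series: coefficient of t^k is E[X^k]/k!.
# For X ~ Exp(lam), E[X^k] = k!/lam^k, so that coefficient is 1/lam^k.
def mgf_coeffs(lam, order):
    return [1.0 / lam**k for k in range(order + 1)]

def poly_mul(a, b):
    """Multiply two truncated power series (coefficient convolution)."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

cx = mgf_coeffs(2.0, 4)    # X ~ Exp(2), E(X) = 1/2
cy = mgf_coeffs(3.0, 4)    # Y ~ Exp(3), E(Y) = 1/3
prod = poly_mul(cx, cy)    # series of M_X(t) * M_Y(t) = M_{X+Y}(t)

# Coefficient of t^0 is 1; coefficient of t^1 is E(X+Y) = E(X) + E(Y) = 5/6.
print(prod[0], prod[1])
```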