Assume there are $n$ independent and identically distributed sample pairs $(X_1, Y_1), (X_2, Y_2), (X_3, Y_3), \dots, (X_n, Y_n)$.
How do you consistently estimate the following function $M(t_1, t_2)$, such that the estimator converges in probability to the function?
The function is $M(t_1,t_2) = \mathbb E\left[e^{t_1X+t_2Y}\right]$ (https://i.stack.imgur.com/JOHgT.jpg), the bivariate moment-generating function of $(X, Y)$.
Entire question here for reference: https://i.stack.imgur.com/Fxjsv.png
Thank you!
This is a question on estimating a moment-generating function. We will do it using a key property of the MGF, namely $\mathbb E[X^n] = M_X^n(0)$,
where the superscript indicates the $n$th derivative.
The final consistent estimator of the moment-generating function based on the samples $\{(X_1,Y_1),\dots,(X_n,Y_n)\}$ (assuming $X$ and $Y$ are independent) is \begin{align*} \widehat M(t_1,t_2) &= \widehat M_X(t_1)\widehat M_Y(t_2)\\ &= \left(1+\left[\sum\limits_{k=1}^nX_k\right]\frac{t_1}{1!\,n} +\left[\sum\limits_{k=1}^nX_k^2\right]\frac{t_1^2}{2!\,n}+ \left[\sum\limits_{k=1}^nX_k^3\right]\frac{t_1^3}{3!\,n}+\dots\right)\cdot\\ &\qquad\left(1+\left[\sum\limits_{k=1}^nY_k\right]\frac{t_2}{1!\,n} +\left[\sum\limits_{k=1}^nY_k^2\right]\frac{t_2^2}{2!\,n}+ \left[\sum\limits_{k=1}^nY_k^3\right]\frac{t_2^3}{3!\,n}+\dots\right)\\ \end{align*}
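As a sanity check, the estimator above can be sketched in code. Everything here is illustrative: in practice the infinite series must be truncated at some order, so the `K` parameter below is an assumption of this sketch, not part of the formula itself.

```python
import math
import numpy as np

def mgf_estimate(xs, ys, t1, t2, K=10):
    """Series estimator of M(t1, t2) built from sample moments.

    K is an illustrative truncation order for the infinite series
    (the formula above keeps all terms); xs, ys are the samples.
    """
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    # M_X-hat(t1) = sum_k [sample mean of X^k] * t1^k / k!  (k = 0 gives the leading 1)
    mx = sum(np.mean(xs ** k) * t1 ** k / math.factorial(k) for k in range(K + 1))
    my = sum(np.mean(ys ** k) * t2 ** k / math.factorial(k) for k in range(K + 1))
    return mx * my
```

For example, with every sample equal to $1$ the estimate approaches $e^{t_1+t_2}$, the MGF of a point mass at $1$ in each coordinate.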
Claim 1: If $X$ and $Y$ are independent, then $M(t_1,t_2) = M_X(t_1)M_Y(t_2)$
Proof: Since $X$ and $Y$ are independent, the joint density factors as $f_X(x)f_Y(y)$, so \begin{align*} M(t_1,t_2) &= \mathbb E\left[e^{t_1X+t_2Y}\right]\\ &= \int\limits_{x=-\infty}^\infty\quad\int\limits_{y=-\infty}^\infty e^{t_1x+t_2y}\,f_Y(y)f_X(x)\,dy\,dx\\ &= \int\limits_{x=-\infty}^\infty e^{t_1x}f_X(x)\,dx \quad\int\limits_{y=-\infty}^\infty e^{t_2y}f_Y(y)\,dy\\ &= M_X(t_1) M_Y(t_2) \end{align*}
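Claim 1 can also be checked numerically. The sketch below (its assumptions: $X \sim N(0,1)$ and $Y \sim \mathrm{Exp}(1)$ drawn independently, with $t_1, t_2$ chosen so both MGFs exist) compares a direct sample average of $e^{t_1X+t_2Y}$ against the product of the two marginal sample averages.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=n)        # X ~ N(0, 1), drawn independently of Y
y = rng.exponential(size=n)   # Y ~ Exp(1); its MGF exists for t2 < 1

t1, t2 = 0.4, 0.3
joint = np.mean(np.exp(t1 * x + t2 * y))                 # estimates E[e^{t1 X + t2 Y}]
product = np.mean(np.exp(t1 * x)) * np.mean(np.exp(t2 * y))
# With X and Y independent the two agree up to sampling noise;
# the exact value here is e^{t1^2/2} / (1 - t2).
```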
Claim 2: If $M_X(t_1) = 1+a_1t_1 + a_2t_1^2 + \dots+ a_kt_1^k+\dots$,
then $a_k = \frac{M_X^k(0)}{k!}$
Proof: \begin{align*} M_X^k(0)&= k! a_k + (k+1)! a_{k+1}t_1 + \left(\prod\limits_{i=3}^{k+2} i\right) a_{k+2}t_1^2+\dots\bigg|_{t_1=0}\\ &= k! a_k + (k+1)! a_{k+1}0 + \left(\prod\limits_{i=3}^{k+2} i\right) a_{k+2}0^2+\dots\\ &=k! a_k\\ \frac{M_X^k(0)}{k!}&= a_k \end{align*}
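As a quick check of Claim 2, take the degenerate case $X \equiv 1$, so $M_X(t_1) = e^{t_1} = 1 + t_1 + \frac{t_1^2}{2!} + \dots$, i.e. $a_k = \frac{1}{k!}$. Every derivative of $e^{t_1}$ at $0$ equals $1$, so the claim gives
\begin{align*}
a_k = \frac{M_X^k(0)}{k!} = \frac{1}{k!},
\end{align*}
matching the series coefficients.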
Claim 3: $\mathbb E[X^n] = M_X^n(0)$
Follows from property of moment generating functions.
Claim 4/Corollary: $\frac{\mathbb E[X^k]}{k!}= a_k$
Claim 5: The sample moments are consistent estimators of the moments: for each $i \ge 1$,
$\quad \widehat{\mathbb E}[X^i] = \frac{X_1^i+X_2^i+\dots+X_n^i}{n} = \frac{\sum\limits_{k=1}^nX_k^i}{n}$
(and likewise for $Y$).
Proof: The estimate of $\mathbb E[X^i]$ is the sample average of $X_1^i,\dots,X_n^i$, which converges in probability to $\mathbb E[X^i]$ by the weak law of large numbers, provided the moment exists.
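Claim 5 is just the law of large numbers applied to the powers $X_k^i$, and is easy to see numerically. A small sketch (assuming, purely for illustration, $X \sim \mathrm{Uniform}(0,1)$, whose true moments are $\mathbb E[X^i] = \frac{1}{i+1}$):

```python
import numpy as np

rng = np.random.default_rng(1)
true_moments = [1 / (i + 1) for i in (1, 2, 3)]   # E[X^i] for X ~ Uniform(0, 1)

for n in (100, 10_000, 1_000_000):
    x = rng.uniform(size=n)
    est = [float(np.mean(x ** i)) for i in (1, 2, 3)]
    print(n, est)   # sample moments approach [0.5, 0.333..., 0.25] as n grows
```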
Now we have all the ingredients to solve our problem. Estimating each coefficient $a_k$ by $\frac{1}{k!}\cdot\frac{\sum_{j=1}^n X_j^k}{n}$ (Claims 4 and 5) gives \begin{align*} \widehat M_X(t_1) &= 1+\left[\sum\limits_{k=1}^nX_k\right]\frac{t_1}{1!\,n} +\left[\sum\limits_{k=1}^nX_k^2\right]\frac{t_1^2}{2!\,n}+ \left[\sum\limits_{k=1}^nX_k^3\right]\frac{t_1^3}{3!\,n}+\dots\\ \end{align*} Similarly, \begin{align*} \widehat M_Y(t_2) &= 1+\left[\sum\limits_{k=1}^nY_k\right]\frac{t_2}{1!\,n} +\left[\sum\limits_{k=1}^nY_k^2\right]\frac{t_2^2}{2!\,n}+ \left[\sum\limits_{k=1}^nY_k^3\right]\frac{t_2^3}{3!\,n}+\dots\\ \end{align*} Then by Claim 1, \begin{align*} \widehat M(t_1,t_2) &= \widehat M_X(t_1)\widehat M_Y(t_2)\\ &= \left(1+\left[\sum\limits_{k=1}^nX_k\right]\frac{t_1}{1!\,n} +\left[\sum\limits_{k=1}^nX_k^2\right]\frac{t_1^2}{2!\,n}+ \left[\sum\limits_{k=1}^nX_k^3\right]\frac{t_1^3}{3!\,n}+\dots\right)\cdot\\ &\qquad\left(1+\left[\sum\limits_{k=1}^nY_k\right]\frac{t_2}{1!\,n} +\left[\sum\limits_{k=1}^nY_k^2\right]\frac{t_2^2}{2!\,n}+ \left[\sum\limits_{k=1}^nY_k^3\right]\frac{t_2^3}{3!\,n}+\dots\right)\\ \end{align*}
Consistency:
By definition, $M(t_1,t_2) = \mathbb E\left[e^{t_1X+t_2Y}\right]$ is a moment-generating function (MGF). MGFs satisfy the properties we used above; specifically, we constructed the estimated MGF entirely from estimates of the moments.
Since the sample moments are consistent and converge in probability to the true moments as $n\to\infty$, the resulting estimate of $M(t_1,t_2)$ is consistent at each $(t_1,t_2)$ where the MGF exists.
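To see the consistency empirically, the whole construction can be run end to end. This sketch truncates the series at order `K` (an assumption of the sketch) and uses independent standard normals, for which the true value is $M(t_1,t_2) = e^{(t_1^2+t_2^2)/2}$:

```python
import math
import numpy as np

def mgf_estimate(xs, ys, t1, t2, K=12):
    # Truncated version of the product-of-series estimator constructed above.
    mx = sum(np.mean(xs ** k) * t1 ** k / math.factorial(k) for k in range(K + 1))
    my = sum(np.mean(ys ** k) * t2 ** k / math.factorial(k) for k in range(K + 1))
    return mx * my

rng = np.random.default_rng(2)
t1, t2 = 0.5, 0.5
true_val = math.exp((t1 ** 2 + t2 ** 2) / 2)   # MGF of two independent N(0, 1)

est = None
for n in (100, 10_000, 1_000_000):
    x, y = rng.normal(size=n), rng.normal(size=n)
    est = mgf_estimate(x, y, t1, t2)
    print(n, est, "target:", true_val)
```

As $n$ grows, the printed estimates settle near the target value, illustrating the convergence in probability argued above.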