Determining a random variable through the Taylor expansion of its moment generating function


Let $X$ be a random variable taking values in a compact set $K\subset \mathbb{R}$. The moment generating function (MGF) of $X$, denoted $M_X(t)$, $t\in \mathbb{R}$, is defined as $$M_X(t) = \mathrm{E} [e^{tX}] = \int_K e^{tx}\,d\mathbb{F}(x).$$

From the Wikipedia page, one can compute the expansion of

$$\begin{align*} M_X(t) = \mathrm{E} [e^{tX}] &= 1 + t\,\mathrm{E}(X) + \frac{t^2\, \mathrm{E}(X^2)}{2!} + \cdots \\ &= 1 + t m_1 + \frac{t^2 m_2}{2!} + \cdots, \end{align*}$$

where $m_k = \mathrm{E}(X^k)$ is the $k$-th moment of $X$. However, I have a few questions about MGFs and about using the MGF to determine $X$.

  1. What is the region of convergence (ROC) of the above Taylor expansion? I suppose this is related to the moments $\mathrm{E}(X^k)$, $k = 1, 2, \ldots$. Are there necessary and sufficient conditions on $\{\mathrm{E}(X^k)\}_{k = 1}^\infty$ for the Taylor series of $M_X(t)$ to have a positive radius of convergence?

  2. I remember the statement that 'an MGF uniquely determines a random variable'. When we say two MGFs are equal, do we automatically imply that their ROCs are the same? Is it possible for two MGFs to agree on an interval but have different ROCs?

  3. Now consider another random variable $Y$ given as a function of $X$. The expectation of $Y^k$ is computed as $\mathrm{E}(Y^k) = \int_K Y(x)^k\, d\mathbb{F}(x)$. The MGF of $Y$, denoted $M_Y(t)$, can be computed as $$M_Y(t) = \mathrm{E} [e^{tY}] = 1 + t\,\mathrm{E}(Y) + \frac{t^2\, \mathrm{E}(Y^2)}{2!} + \cdots.$$ From my understanding, the '$k$-th moment' of $Y$ should be $y^k$ integrated with respect to the distribution of $Y$, so I don't see why $\mathrm{E}(Y^k)$, computed as above, should be called the '$k$-th moment' of $Y$. Then what is $\mathrm{E}(Y^k)$ called? Is $M_Y(t)$ still called the 'moment generating function'? Does $M_Y(t)$ still uniquely determine $Y$?


There are 2 best solutions below


As a preamble to the following answers, I note that if the moment generating function of a random variable has a positive radius of convergence, it uniquely determines the distribution of that random variable, but not the random variable itself, because there are always many different random variables with any given distribution.

  1. Since $\ K\ $ is compact, it is bounded. If $\ B\ge1\ $ is a bound for $\ K\ $, then $\ \displaystyle\left|m_k\right|=\left|\,\int_Kx^kd\mathbb{F}(x)\,\right|\le B^k\ $, so $\ \displaystyle\sum_{k=0}^\infty \frac{m_kt^k}{k!}\ $ converges for all $\ t\in\mathbb{C}\ $ by comparison with the series for $\ e^{B|t|}\ $. That is, the radius of convergence is always infinite. I'm not aware of any simple necessary and sufficient conditions for the radius of convergence to be positive in the case when $\ K\ $ is not bounded.
  2. No. Two power series whose values are the same over an interval of positive length must have the same coefficients and therefore the same radius of convergence.
  3. If $\ \mathbb{G}\ $ is the distribution of $\ Y\ $, then $\ \displaystyle\int_KY(x)^kd\mathbb{F}(x)= \int_{Y(K)}y^kd\mathbb{G}(y)=E(Y^k )\ $, and it is immaterial whether you use the first or the second integral to calculate $\ E(Y^k )\ $. You'll get the same moment generating function in either case, so there's no reason to call it anything other than "the moment generating function of $\ Y\ $", and provided its radius of convergence is positive, it will still uniquely determine $\ \mathbb{G}\ $, the distribution of $\ Y\ $, but not $\ Y\ $ itself.
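The equality of the two integrals in point 3 (the law of the unconscious statistician) is easy to check numerically. The sketch below assumes, purely for illustration, $X \sim \mathrm{Uniform}(0,1)$ and $Y = X^2$ (neither appears in the question); then $\mathbb{G}(y) = \sqrt{y}$ with density $1/(2\sqrt{y})$ on $(0,1]$, and both integrals equal $1/(2k+1)$.

```python
# Sketch, assuming X ~ Uniform(0,1) and Y = X^2 (hypothetical choices, not from the post).
# E[Y^k] computed two ways: integrating Y(x)^k against dF(x), and y^k against dG(y).

def midpoint_integral(f, a, b, n=100_000):
    """Midpoint rule for the integral of f over [a, b]; midpoints avoid the endpoint y = 0."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def moment_via_F(k):
    # ∫_0^1 (x^2)^k dF(x), with F the Uniform(0,1) cdf (density 1 on [0,1])
    return midpoint_integral(lambda x: x ** (2 * k), 0.0, 1.0)

def moment_via_G(k):
    # ∫_0^1 y^k dG(y), with G(y) = sqrt(y), density g(y) = 1/(2 sqrt(y))
    return midpoint_integral(lambda y: y ** k / (2.0 * y ** 0.5), 0.0, 1.0)

for k in range(1, 6):
    exact = 1.0 / (2 * k + 1)   # closed form shared by both integrals
    assert abs(moment_via_F(k) - exact) < 1e-6
    assert abs(moment_via_G(k) - exact) < 1e-6
```

Either route yields the same moment sequence, hence the same moment generating function for $Y$.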

For the first question:

See Theorem 5 of [1].

Let $Y$ be a random variable. Let $R$ denote the radius of convergence of the series $\sum_{k=0}^\infty \frac{t^k}{k!}\mathbb{E}[Y^k]$ given by $$R = \frac{1}{\limsup_{n\to \infty} \sqrt[n]{\frac{1}{n!}|\mathbb{E}[Y^n]|}}.$$ Let $R' = \sup\{t > 0: \ \mathbb{E}[\mathrm{e}^{tY}] < \infty, \ \mathbb{E}[\mathrm{e}^{-tY}] < \infty\}$.

Fact 1: If $\mathbb{E}[|Y|^k]$ exists (finite) for $k \ge 1$, and $R > 0$, then $R' = R$, and $\mathbb{E}[\mathrm{e}^{tY}] = \sum_{k=0}^\infty \frac{t^k}{k!}\mathbb{E}[Y^k]$ for $t$ with $|t| < R$.

Fact 2: If $R' > 0$, then $\mathbb{E}[|Y|^k]$ exists (finite) for $k \ge 1$, and $R = R'$, and $\mathbb{E}[\mathrm{e}^{tY}] = \sum_{k=0}^\infty \frac{t^k}{k!}\mathbb{E}[Y^k]$ for $t$ with $|t| < R'$.

For your problem, consider the series $$\sum_{k=0}^\infty \frac{t^k}{k!}\mathbb{E}[X^k].$$ Its radius $R$ of convergence is given by $$R = \frac{1}{\limsup_{n\to \infty} \sqrt[n]{\frac{1}{n!}|\mathbb{E}[X^n]|}} = \frac{1}{\limsup_{n\to \infty} \frac{\mathrm{e}}{n}\sqrt[n]{|\mathbb{E}[X^n]|}} = \infty$$ where we have used $\sqrt[n]{|\mathbb{E}[X^n]|}\le B$ for some constant $B > 0$ (since $X$ is defined on some compact subset of $\mathbb{R}$) and Stirling's formula $n! \sim \sqrt{2\pi n}\, n^n \mathrm{e}^{-n}$.
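The Stirling step above, $\sqrt[n]{1/n!} \sim \mathrm{e}/n$, can be verified numerically. A small check (illustration only; it uses logs via `math.lgamma` because $1/n!$ underflows to zero in floating point for large $n$):

```python
# Numeric check of the Stirling step: (1/n!)^(1/n) ~ e/n as n -> infinity.
# Combined with |E[X^n]|^(1/n) <= B, this drives the limsup to 0, so R = infinity.
import math

for n in (10, 100, 1000):
    lhs = math.exp(-math.lgamma(n + 1) / n)   # (1/n!)^(1/n), computed in log space
    rhs = math.e / n
    ratio = lhs / rhs
    print(n, ratio)   # ratio approaches 1 as n grows
```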

As a result, for any $t\in \mathbb{R}$, it holds that $$\mathbb{E}[\mathrm{e}^{tX}] = \sum_{k=0}^\infty \frac{t^k}{k!}\mathbb{E}[X^k].$$
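As a concrete sanity check of this equality, take (a hypothetical choice, not part of the question) $X \sim \mathrm{Uniform}(0,1)$, so that $m_k = \mathbb{E}[X^k] = 1/(k+1)$ and the exact MGF is $(\mathrm{e}^t - 1)/t$. The truncated moment series then matches the exact MGF for every tested $t$, consistent with an infinite radius of convergence:

```python
# Sketch, assuming X ~ Uniform(0,1): moment series vs. the exact MGF (e^t - 1)/t.
import math

def mgf_series(t, terms=60):
    # sum_k t^k E[X^k] / k! with E[X^k] = 1/(k+1) for Uniform(0,1)
    return sum(t ** k / ((k + 1) * math.factorial(k)) for k in range(terms))

def mgf_exact(t):
    return (math.exp(t) - 1.0) / t if t != 0 else 1.0

for t in (-5.0, -1.0, 0.5, 3.0, 10.0):
    assert abs(mgf_series(t) - mgf_exact(t)) < 1e-9
```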

Reference

[1] https://galton.uchicago.edu/~wichura/Stat304/Handouts/L11.mgf.pdf