How to prove this property of the moment generating function?


The moment generating function of a random variable $X$ is defined, for all $\theta \in \mathbb{R}$, by: $M_{X}(\theta) := E(e^{\theta X})$. We denote $\Lambda_{X}(\theta) = \log(M_{X}(\theta))$.
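As a quick numerical sanity check of these definitions, here is a minimal sketch (assuming a standard normal $X$ as the example distribution, for which $M_X(\theta) = e^{\theta^2/2}$ and $\Lambda_X(\theta) = \theta^2/2$ are known in closed form):

```python
import numpy as np

# Monte Carlo check of M_X(theta) = E[e^{theta X}] and
# Lambda_X(theta) = log M_X(theta) for a standard normal X,
# where the closed forms are exp(theta^2/2) and theta^2/2.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

theta = 0.7
m_hat = np.mean(np.exp(theta * x))   # estimate of E[e^{theta X}]
lam_hat = np.log(m_hat)              # estimate of Lambda_X(theta)

print(abs(m_hat - np.exp(theta**2 / 2)))   # small
print(abs(lam_hat - theta**2 / 2))         # small
```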

From the Cramér-Chernoff theorem, we know that $\Lambda_{X}^{*}(t) = \sup_{\theta > 0}(\theta t - \Lambda_{X}(\theta))$.

How to prove that, if $X_1,\ldots,X_n$ are random variables and if $X=X_1+\cdots+X_n$ then $\Lambda_{X}^{*}(t) = n \Lambda_{X_1}^{*}(\frac{t}{n})$ for all $t \in \mathbb{R}$?

How can I prove this? I could not find any formula or property that would lead to it.




As @Semiclassical pointed out in a comment, you probably want to assume that the random variables are independent. Moreover, I will also assume that they are identically distributed.

Now, let $Y = X_1 + \cdots + X_n$, and note that by the independence assumption the moment generating function of $Y$ is given by $M_Y(\theta) = M_{X_1}(\theta) \cdots M_{X_n}(\theta)$. This means that $$\Lambda_{Y}(\theta) = \log(M_{X_1}(\theta)) + \cdots + \log(M_{X_n}(\theta)) = n \log(M_{X_1}(\theta)) = n \Lambda_{X_1}(\theta),$$ where the second equality follows from the assumption that the variables are identically distributed.
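This factorization can be checked numerically; a minimal sketch, assuming i.i.d. Bernoulli($p$) variables as the example (so $M_{X_1}(\theta) = 1 - p + p e^{\theta}$, and the sum $Y$ is Binomial($n$, $p$)):

```python
import numpy as np

# Check Lambda_Y(theta) = n * Lambda_{X1}(theta) for i.i.d. Bernoulli(p),
# whose sum Y is Binomial(n, p) with MGF (1 - p + p e^theta)^n.
# Bernoulli is just an assumed example; the identity holds for any i.i.d. family.
n, p = 8, 0.3
theta = np.linspace(-2.0, 2.0, 41)

lam_x1 = np.log(1 - p + p * np.exp(theta))   # Lambda_{X1}(theta)
m_y = (1 - p + p * np.exp(theta)) ** n       # M_Y = M_{X1}^n by independence
lam_y = np.log(m_y)

print(np.max(np.abs(lam_y - n * lam_x1)))    # ~0 up to floating-point rounding
```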

Next, the transformation $f^*(t) := \sup_x(x t - f(x))$ is known as the convex conjugate (or Fenchel conjugate) of the function $f$. A direct computation (which is also a well-known scaling property of convex conjugates, valid for any constant $a > 0$, such as $a = n$ here) gives $$ (a f)^*(t) = \sup_x(x t - af(x)) = \sup_x\bigl(a (x t/a - f(x))\bigr) = a \sup_x(x t/a - f(x)) = af^*(t/a). $$ Applying this result with $a = n$ gives your sought expression, namely $$ \Lambda_{Y}^*(t) = (n \Lambda_{X_1})^*(t) = n \Lambda_{X_1}^*(t/n). $$
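The scaling rule can also be verified numerically; a minimal sketch, assuming $f(x) = x^2/2$ as the example (the log-MGF of a standard normal, with $f^*(t) = t^2/2$ in closed form) and a brute-force grid supremum:

```python
import numpy as np

# Numerical check of the conjugate scaling rule (a f)^*(t) = a f^*(t/a),
# using the assumed example f(x) = x^2/2, for which f^*(t) = t^2/2,
# so both sides should equal t^2 / (2a).
def conjugate(f, t, grid):
    """Grid approximation of the convex conjugate sup_x (x*t - f(x))."""
    return np.max(grid * t - f(grid))

grid = np.linspace(-50, 50, 200_001)
f = lambda x: x**2 / 2
a, t = 5.0, 2.0

lhs = conjugate(lambda x: a * f(x), t, grid)   # (a f)^*(t)
rhs = a * conjugate(f, t / a, grid)            # a f^*(t/a)
print(lhs, rhs, t**2 / (2 * a))                # all approximately equal
```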