About the conditions of a theorem involving moment generating functions


According to the book "Mathematical Statistics" written by Wiebe R. Pestman, we have the following theorem:

Let $X$ be a random variable and $(X_n)_{n\in\mathbb{N}}$ a sequence of random variables. Suppose the moment generating functions $M_X$ and $M_{X_n}$ exist (for all $n\in\mathbb{N}$) on a common interval $(-\infty ,\xi ]$, and suppose that $\lim_{n\to\infty}M_{X_n}(t)=M_X(t)$ for all $t\in(-\infty,\xi ]$. Then the sequence $(X_n)_{n\in\mathbb{N}}$ converges in distribution to $X$.
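To make the hypothesis concrete, here is a standard illustration (my own example, not from Pestman's book): if $X_n \sim \text{Binomial}(n, \lambda/n)$ and $X \sim \text{Poisson}(\lambda)$, then both MGFs are finite for every real $t$, so in particular on any $(-\infty,\xi]$, and $M_{X_n}(t)\to M_X(t)$ pointwise. A quick numerical sketch:

```python
import math

# Illustration of the theorem's hypothesis (my example, not the book's):
# X_n ~ Binomial(n, lam/n) converges in distribution to X ~ Poisson(lam).
# Both MGFs are finite for all real t, so the common-interval condition holds.

lam = 2.0

def mgf_binomial(n: int, t: float) -> float:
    # M_{X_n}(t) = (1 - p + p e^t)^n with p = lam/n
    p = lam / n
    return (1 - p + p * math.exp(t)) ** n

def mgf_poisson(t: float) -> float:
    # M_X(t) = exp(lam (e^t - 1))
    return math.exp(lam * (math.exp(t) - 1))

for t in (-1.0, 0.5, 1.0):
    for n in (10, 100, 10000):
        print(f"t={t:+.1f}  n={n:>5}  "
              f"M_Xn(t)={mgf_binomial(n, t):.6f}  M_X(t)={mgf_poisson(t):.6f}")
```

As $n$ grows, $M_{X_n}(t)$ approaches $M_X(t)$ at each fixed $t$, which by the theorem gives convergence in distribution to the Poisson law.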

My question is: Is it really necessary for both $M_X(t)$ and $M_{X_n}(t)$ to be finite in the interval $(-\infty,\xi]$?


The book doesn't provide a proof of this theorem. So, could someone please provide a reference that contains a proof?

The book does cite a reference, but I don't have access to it.


Thank you for your attention!