I'm currently reviewing some probability materials. I wish to show that the cumulant generating function of a random variable $X$ is lower semicontinuous. That is, for any $t_0$, given a sequence $t_n\rightarrow t_0$,
$$\liminf_{n\to\infty}\, \log M_X(t_n) \geq \log M_X(t_0)$$
where $M_X(t)$ is the moment generating function of $X$.
I can show that the CGF is convex. I figured I might need to use Jensen's inequality and then conclude the proof with Fatou's lemma, but to no avail. Any help or comments are greatly appreciated.
Let $\Lambda$ denote the cumulant-generating function. In fact, $\Lambda$ is differentiable in the interior of its effective domain $\mathcal D_\Lambda:=\{u\in\mathbb R:\Lambda(u)<\infty\}$, with derivative $$ \Lambda'(\theta) = \frac{\mathbb E[Xe^{\theta X}]}{e^{\Lambda(\theta)}}; $$ a proof of this is given at the end.

By Hölder's inequality with conjugate exponents $1/\alpha$ and $1/(1-\alpha)$, for $\alpha\in(0,1)$, $$\mathbb E\left[e^{(\alpha\theta_1+(1-\alpha)\theta_2)X}\right] = \mathbb E\left[(e^{\theta_1 X})^\alpha(e^{\theta_2 X})^{1-\alpha} \right]\leqslant \left(\mathbb E\left[e^{\theta_1 X} \right]\right)^\alpha \left(\mathbb E\left[e^{\theta_2 X} \right]\right)^{1-\alpha}. $$ Taking logarithms, we have $$ \Lambda(\alpha\theta_1+(1-\alpha)\theta_2) \leqslant \alpha\Lambda(\theta_1)+(1-\alpha)\Lambda(\theta_2), $$ and hence $\Lambda$ is convex.

Now fix $t_0\in\mathbb R$ and let $(t_n)$ be a sequence converging to $t_0$. The random variables $e^{t_n X}$ are nonnegative and converge pointwise to $e^{t_0 X}$, so Fatou's lemma applies: $$ \mathbb E[e^{t_0 X}] = \mathbb E\left[\liminf_{n\to\infty} e^{t_n X}\right]\leqslant\liminf_{n\to\infty}\mathbb E[e^{t_n X}]. $$ Taking logarithms (the function $\log$ is increasing and continuous on $[0,\infty]$), we get $$ \Lambda(t_0)\leqslant\liminf_{n\to\infty}\Lambda(t_n), $$ which shows that $\Lambda$ is lower semicontinuous. Note that lower semicontinuity is the best one can hope for in general: if $X$ has density proportional to $e^{-x}/(1+x^2)$ on $[0,\infty)$, then $\Lambda$ is finite on $(-\infty,1]$ and equal to $+\infty$ on $(1,\infty)$, so $\Lambda$ is lower semicontinuous but not continuous at $t=1$.
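This is not part of the proof, but if you want a quick numerical sanity check of both the convexity and the Fatou inequality, a Monte Carlo estimate is easy to run. Here is a minimal sketch in Python, assuming NumPy is available; the sample size and the choice $X\sim N(0,1)$, for which $\Lambda(t)=t^2/2$ exactly, are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1), for which Lambda(t) = t**2 / 2

def cgf(t):
    """Monte Carlo estimate of Lambda(t) = log E[exp(t X)]."""
    return np.log(np.mean(np.exp(t * x)))

# Convexity: Lambda(a*t1 + (1-a)*t2) <= a*Lambda(t1) + (1-a)*Lambda(t2)
t1, t2, a = -0.5, 1.5, 0.3
print(cgf(a * t1 + (1 - a) * t2), "<=", a * cgf(t1) + (1 - a) * cgf(t2))

# Lower semicontinuity at t0 = 1: the estimates along t_n -> t0 should not
# drop below the estimate at t0 in the limit
t0 = 1.0
print([round(cgf(t), 4) for t in (0.9, 0.99, 1.01, 1.1)], "vs", round(cgf(t0), 4))
```

The estimates should track the exact values $t^2/2$ up to Monte Carlo error.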
Now let $\theta$ be an interior point of $\mathcal D_\Lambda$, and choose $\varepsilon>0$ such that $\theta+\varepsilon$ and $\theta-\varepsilon$ both lie in $\mathcal D_\Lambda$. Then $$\lim_{\delta\to 0}\frac1\delta\left(\mathbb E\left[e^{(\theta+\delta)X}\right] -\mathbb E\left[e^{\theta X}\right]\right) = \lim_{\delta\to 0}\mathbb E\left[\frac{e^{(\theta+\delta)X} - e^{\theta X}}\delta \right].$$ Pointwise, $$\lim_{\delta\to0}\frac1\delta \left(e^{(\theta+\delta)X} - e^{\theta X}\right) = Xe^{\theta X}, $$ and for all $\delta$ with $0<|\delta|\leqslant\varepsilon$, $$\left|\frac{e^{(\theta+\delta)X} - e^{\theta X}}{\delta}\right| = e^{\theta X}\,\frac{\left|e^{\delta X}-1\right|}{|\delta|}\leqslant \frac1\varepsilon\, e^{\theta X}\left(e^{\varepsilon|X|}-1\right)=:Z_\varepsilon, $$ where the inequality holds because the difference quotient $s\mapsto(e^s-1)/s$ of the convex function $s\mapsto e^s$ is increasing. Moreover, $Z_\varepsilon$ is integrable: $$ e^{\theta X}e^{\varepsilon|X|}\leqslant e^{(\theta+\varepsilon)X}+e^{(\theta-\varepsilon)X}, $$ and both terms on the right have finite expectation by the choice of $\varepsilon$. Hence by the dominated convergence theorem, $$ \lim_{\delta\to0}\mathbb E\left[\frac{e^{(\theta+\delta)X}-e^{\theta X}}{\delta}\right] = \mathbb E[Xe^{\theta X}]. $$ It follows that $\frac{\mathsf d}{\mathsf d\theta}\mathbb E[e^{\theta X}] = \mathbb E[Xe^{\theta X}]$. Since $\Lambda(\theta) = \log\mathbb E[e^{\theta X}]$, the chain rule gives $$ \Lambda'(\theta) = \frac{\mathbb E[Xe^{\theta X}]}{e^{\Lambda(\theta)}}. $$
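The derivative formula can be sanity-checked numerically in the same spirit. A minimal sketch, again assuming NumPy and using a standard normal (where $\Lambda(\theta)=\theta^2/2$, so $\Lambda'(\theta)=\theta$); the seed, sample size, and $\theta=0.7$ are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)  # X ~ N(0, 1): Lambda(theta) = theta**2 / 2, so Lambda'(theta) = theta

theta = 0.7
m = np.mean(np.exp(theta * x))       # estimate of M_X(theta) = e^{Lambda(theta)}
dm = np.mean(x * np.exp(theta * x))  # estimate of E[X e^{theta X}]
print("Lambda'(theta) via the formula:", dm / m)  # should be close to theta = 0.7

# Cross-check with a central finite difference of the estimated CGF
h = 1e-4
cgf = lambda t: np.log(np.mean(np.exp(t * x)))
print("finite difference:             ", (cgf(theta + h) - cgf(theta - h)) / (2 * h))
```

Both printed values should agree with $\theta$ up to Monte Carlo and discretization error.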