This is on p. 91 of Probability: Theory and Examples, 5th edition, by Rick Durrett. Although it comes from the section on large deviations, I think my question applies to moment generating functions in general.
Define $\varphi(\theta) = E(e^{\theta X})$. Let $\theta_+ = \sup \{\theta : \varphi(\theta) < \infty\}$ and $\theta_- = \inf \{\theta : \varphi(\theta) < \infty\}$.
Why is $\varphi(\theta) < \infty$ on $(\theta_-, \theta_+)$? I think the reason is the argument in my second question below, but I want to confirm it.
Assume $\theta_+ > 0$; can I prove $\varphi(\theta) < \infty$ for all $\theta \in (0, \theta_+)$? My idea is the following. Pick $\theta_0$ with $\max\{\theta_-, 0\} < \theta_0 < \theta_+$. Note that $e^{\theta x} \leq 1 + e^{\theta_0 x}$ for all $x$ whenever $0 < \theta < \theta_0$. Thus $\varphi(\theta) \leq 1 + \varphi(\theta_0) < \infty$, and since $\theta_0$ can be taken arbitrarily close to $\theta_+$, this covers all of $(0, \theta_+)$.
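The pointwise bound $e^{\theta x} \le 1 + e^{\theta_0 x}$ can be verified by splitting on the sign of $x$ (a short check in the question's notation):

$$
\text{if } x \ge 0:\quad e^{\theta x} \le e^{\theta_0 x} \le 1 + e^{\theta_0 x}, \qquad\qquad
\text{if } x < 0:\quad e^{\theta x} \le 1 \le 1 + e^{\theta_0 x}.
$$

Taking expectations on both sides then gives $\varphi(\theta) \le 1 + \varphi(\theta_0)$.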
If this is true, I am confused about the proof of Lemma 2.7.2, which discusses $0 < \theta < \theta_+$: that part would then seem unnecessary.
The proof of Lemma 2.7.2 is discussed in another question: Lemma 2.7.2 in probability theory and examples by Rick Durrett.
I don't have the book in front of me, but you can show one side and then say the other side follows analogously. You have shown the positive side; your conclusion is (correctly) for $0 < \theta < \theta_0$. You have not yet concluded the statement for negative $\theta$.
Assume $\phi(\theta) < \infty$ for some $\theta > 0$. Then you have already shown that $\phi(t) < \infty$ for all $t \in [0, \theta]$.
For negative $\theta < 0$ with $\phi(\theta) < \infty$, just consider the random variable $Y = -X$. Your proof for positive arguments, applied to $Y$ at the point $-\theta > 0$, now implies that $\phi(t) < \infty$ whenever $t \in [\theta, 0]$.
In summary: finiteness for one $\theta$ of a given sign (say positive) implies finiteness for all $t$ of the same sign with smaller absolute value. The other sign then follows easily, and together these give finiteness on all of $(\theta_-, \theta_+)$.
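To see the thresholds concretely, here is a small numerical sketch (my own illustration, not from the book or the question) using $X \sim \mathrm{Exp}(1)$, for which $\varphi(\theta) = \int_0^\infty e^{\theta x} e^{-x}\,dx = 1/(1-\theta)$ when $\theta < 1$, so $\theta_+ = 1$ and $\theta_- = -\infty$:

```python
import math

def mgf_exp1(theta, T=200.0, n=200_000):
    """Approximate phi(theta) = E[e^{theta X}] for X ~ Exp(1) by a
    trapezoid rule on the integrand e^{(theta-1)x} over [0, T].
    The exact value is 1/(1 - theta) for theta < 1; the integral
    diverges for theta >= 1, matching theta_+ = 1."""
    h = T / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * math.exp((theta - 1.0) * x)
    return total * h

# Finite strictly inside (theta_-, theta_+) = (-inf, 1):
print(round(mgf_exp1(0.5), 3))   # close to 1/(1 - 0.5) = 2.0
print(round(mgf_exp1(-1.0), 3))  # close to 1/(1 - (-1)) = 0.5
```

This matches the answer's reduction as well: finiteness at $\theta = 0.5$ gives finiteness on $[0, 0.5]$, and the negative case corresponds to applying the positive-side argument to $Y = -X$.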