The following is a deleted question from jxm
All of the results I have studied about moment-generating functions concern the case in which the MGF is finite on a neighbourhood of zero. There is a result that relates this to the growth of the norms: if $X$ is a random variable, then $\|X\|_n\leq Bn$ for some $B>0$ and every $n\geq1$ if and only if $E[e^{tX}]<\infty$ for $t$ in a neighbourhood of zero.
My question is about the following implication: if $E[e^{tX}]<\infty$ for all $t\in\mathbb{R}$, then $\|X\|_n\leq Bn^r$ for some $B>0$ and some $0\leq r<1$. Is this a known result in the literature?
I had a think about this, and was wondering:
How does one derive the bounds described here for the MGF finite on a finite interval containing $0$?
What happens in the case the question is asking about? Is there anything better than $O(n)$ in general?
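As a quick numerical illustration of the quoted equivalence (my own check, not part of the question): for $X \sim \mathrm{Exponential}(1)$ the MGF $E[e^{tX}] = 1/(1-t)$ is finite exactly for $t<1$, a neighbourhood of zero, and correspondingly $\lVert X\rVert_n = (n!)^{1/n} \sim n/e$, so the linear bound $\lVert X\rVert_n \leq Bn$ holds with $B=1$.

```python
import math

# X ~ Exponential(1): E[X^n] = n!, so ||X||_n = (n!)^(1/n), computed via lgamma.
# The MGF 1/(1-t) is finite exactly on t < 1, and the norms grow linearly.
norms = [math.exp(math.lgamma(n + 1) / n) for n in range(1, 201)]

linear_bound = all(norms[n - 1] <= n for n in range(1, 201))  # B = 1 works
ratio = norms[-1] / (200 / math.e)                            # -> 1 as n grows
```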
MGF finite on $ (a,b) \ni 0 $
Write $c = \min{\{-a,b\}}$, so that $M_X(s)$ certainly exists for $|s| < c$. We consider first the case of even $n=2m$. Since $ s^{2m} X^{2m}/(2m)! \leq \cosh{sX} $, taking expectations gives $$ E[X^{2m}] \leq \frac{(2m)!}{2s^{2m}} (M_X(s)+M_X(-s)) $$ for $0<s<c$. It now suffices to examine $(2m)!$: we have $$ \log{(2m)!} = \sum_{k=1}^{2m} \log{k} < \int_1^{2m+1} \log{x} \, dx = (2m+1)(\log{(2m+1)}-1) + 1 , $$ so $$ (2m)! < e \, (2m+1)^{2m+1} e^{-(2m+1)} = (2m+1)^{2m+1} e^{-2m} , $$ and taking the $2m$th root gives $$ \lVert X \rVert_{2m} \leq 2^{-1/(2m)} s^{-1} ((2m)!)^{1/(2m)} (M_X(s)+M_X(-s))^{1/(2m)} < s^{-1} (M_X(s)+M_X(-s)) (2m+1)^{1+1/(2m)} e^{-1} < 2mB $$ for an appropriate $B$ (here we used $M_X(s)+M_X(-s)\geq 2$ to bound its $2m$th root by the sum itself).
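Here is a small numerical check of the even-moment bound (my own sketch): take $X \sim N(0,1)$, where $E[X^{2m}] = (2m)!/(2^m m!)$ and $M_X(s) = e^{s^2/2}$, and fix $s = 1$.

```python
import math

# Numerical check of the even-moment bound for X ~ N(0,1), with s = 1.
s = 1.0
M_sum = 2 * math.exp(s * s / 2)  # M_X(s) + M_X(-s)

norms, bounds = [], []
for m in range(1, 51):
    # ||X||_{2m}: E[X^{2m}] = (2m)!/(2^m m!), computed in logs via lgamma
    log_moment = math.lgamma(2 * m + 1) - m * math.log(2) - math.lgamma(m + 1)
    norms.append(math.exp(log_moment / (2 * m)))
    # bound from the argument: (1/s) * ((2m)!/2)^(1/(2m)) * (M_X(s)+M_X(-s))^(1/(2m))
    log_bound = (math.lgamma(2 * m + 1) - math.log(2) + math.log(M_sum)) / (2 * m)
    bounds.append(math.exp(log_bound) / s)
```

For this choice of $X$ and $s$ the bound is not only valid but itself below $2m$, i.e. $B=1$ suffices.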
For the odd terms, by Hölder's inequality and AM–GM we have $$ \lVert X \rVert_{a} \leq \lVert X \rVert_{b}^{1-\theta} \lVert X \rVert_{c}^{\theta} \leq (1-\theta)\lVert X \rVert_{b} + \theta \lVert X \rVert_{c} $$ where $$ \frac{1}{a} = \frac{1-\theta}{b} + \frac{\theta}{c} ; $$ putting $a,b,c = n,n-1,n+1$ with $n=2m+1$ odd gives $$ \frac{1}{n} = \frac{1-\theta}{n-1} + \frac{\theta}{n+1} , $$ so $\theta = (n+1)/(2n)$. Then $$ (1-\theta)\lVert X \rVert_{2m}+ \theta \lVert X \rVert_{2m+2} < (1-\theta) (2mB) + \theta (2m+2)B = B\left(2m+1 + \frac{1}{2m+1}\right) < 2B(2m+1), $$ so the bound for the even terms also gives a (possibly slightly worse) bound for the odd terms. No doubt this could be improved further, but there is little point, since I did not look for the best possible bound in the even case anyway.
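The interpolation step can likewise be checked numerically (my own sketch), again with $X \sim \mathrm{Exponential}(1)$, where $\lVert X\rVert_p = \Gamma(p+1)^{1/p}$:

```python
import math

# ||X||_p for X ~ Exponential(1): E[X^p] = Gamma(p+1), so ||X||_p = Gamma(p+1)^(1/p).
def norm(p):
    return math.exp(math.lgamma(p + 1) / p)

checks = []
for n in range(3, 101, 2):  # odd n
    theta = (n + 1) / (2 * n)
    # the exponents really interpolate: 1/n = (1-theta)/(n-1) + theta/(n+1)
    lhs, rhs = 1 / n, (1 - theta) / (n - 1) + theta / (n + 1)
    # Hölder interpolation: ||X||_n <= ||X||_{n-1}^{1-theta} * ||X||_{n+1}^{theta}
    interp = norm(n - 1) ** (1 - theta) * norm(n + 1) ** theta
    checks.append((abs(lhs - rhs), norm(n), interp))
```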
MGF finite on $\mathbb{R}$
For ease of calculation, every distribution here is supported on $[0,\infty)$. (This doesn't make much difference since we are interested in the absolute value anyway.)
Sadly the answer is no. One naturally derives some hope from densities that are $O(e^{-x^{r}})$ for some $r>1$: a straightforward calculation (substitute $u=x^r$) reveals that in such cases $$ E[X^n] = O(\Gamma((n+1)/r)/r) , $$ and $$ \log{\Gamma\left(\frac{n+1}{r}\right)} - \log{r} \sim \frac{n}{r}(\log{n} - 1) , $$ whence $ E[X^n]^{1/n} = O( n^{1/r} ) $, a genuinely smaller power since $1/r<1$.
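A quick numerical confirmation of the $\Gamma$ identity and the resulting $O(n^{1/r})$ growth (my own sketch), taking $r=2$ and the unnormalised density $e^{-x^2}$, with Simpson's rule for the integral:

```python
import math

# Identity: int_0^inf x^n e^{-x^r} dx = Gamma((n+1)/r) / r  (substitute u = x^r).
# Check by Simpson's rule for r = 2, n = 5, where the exact value is Gamma(3)/2 = 1.
r, n = 2.0, 5
def f(x):
    return x ** n * math.exp(-x ** r)

a, b, N = 0.0, 20.0, 20000
h = (b - a) / N
total = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, N))
numeric = total * h / 3
exact = math.gamma((n + 1) / r) / r

# Growth of the (unnormalised) moments: E[X^m]^(1/m) / m^(1/r) stays bounded
ratios = [math.exp((math.lgamma((m + 1) / r) - math.log(r)) / m) / m ** (1 / r)
          for m in range(1, 201)]
```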
Naturally we need to consider something more borderline than this to find a counterexample. An obvious choice is a density proportional to $ e^{-x\log{x}} $. Is the MGF finite on $\mathbb{R}$? Yes: for $t\leq0$ this is obvious, while for $t > 0$ the exponent $-x\log{x}+tx $ is eventually less than $-x$ (as soon as $\log{x}>t+1$).
To compute the leading asymptotics of $E[X^n]$, we use Laplace's method. The function in the exponential is $\phi(x) = -x\log{x}+n\log{x}$, so $$ \phi'(x) = -\log{x} - 1 + \frac{n}{x} , $$ which unfortunately needs the Lambert $W$-function to solve: the stationary point is $x_0= n/W(en)$. We recall that $W$ is defined by $We^W = x$, and thus also $$ W + \log{W} = \log{x} ; $$ in particular $W(en)+\log{W(en)} = 1+\log{n}$, so $\log{x_0} = \log{n}-\log{W(en)} = W(en)-1$. We eventually find that $$ \phi(x_0+u) = n\left(W(en)-2+\frac{1}{W(en)}\right) - \frac{W(en)(1+W(en))}{2n} u^2 +O(u^3) , $$ and thus substitute $u = v \sqrt{n/(W(en)(1+W(en)))} $ to find that $$ E[X^n] \sim \sqrt{\frac{2\pi n }{W(en)(1+W(en))}} \exp{\left(n\left(W(en)-2+\frac{1}{W(en)}\right)\right)} $$ (the normalising constant of the density changes $\lVert X \rVert_n$ only by a factor $e^{O(1/n)}$, so I ignore it).
Then, $$ \begin{align} \log{\lVert X \rVert_n} ={}& \frac{1}{n}\log{E[X^n]} \sim W(en) - 2 + \frac{1}{W(en)} \\ &+ \frac{1}{2n} ( \log{n} - \log{W(en)} - \log{(1+W(en))} + \log{2\pi}) . \end{align} $$
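As a sanity check on the Laplace computation (my own sketch; the Lambert $W$ is computed by a Newton iteration rather than any library call), one can compare the asymptotic formula against direct numerical integration of $\int_0^\infty x^n e^{-x\log{x}}\,dx$ for moderate $n$; at $n=100$ the two logarithms already agree closely.

```python
import math

def lambert_w(y):
    # Newton's method for the principal branch: solve w * e^w = y, y > 0
    w = math.log(y) if y > math.e else 1.0
    for _ in range(100):
        ew = math.exp(w)
        w -= (w * ew - y) / (ew * (w + 1))
    return w

n = 100
W = lambert_w(math.e * n)
x0 = n / W  # stationary point of phi

def phi(x):
    return -x * math.log(x) + n * math.log(x)

# stationary value: phi(x0) = n * (W - 2 + 1/W)
stationary_err = abs(phi(x0) - n * (W - 2 + 1 / W))

# Laplace prediction for I(n) = int_0^inf x^n e^{-x log x} dx, in logs
log_laplace = phi(x0) + 0.5 * math.log(2 * math.pi * n / (W * (1 + W)))

# direct Simpson integration of e^{phi(x) - phi(x0)}, then restore phi(x0)
sigma = math.sqrt(n / (W * (1 + W)))
a, b, N = 1e-9, x0 + 15 * sigma, 50000
h = (b - a) / N

def g(x):
    return math.exp(phi(x) - phi(x0))  # underflows harmlessly to 0 far from x0

total = g(a) + g(b) + sum((4 if k % 2 else 2) * g(a + k * h) for k in range(1, N))
log_numeric = phi(x0) + math.log(total * h / 3)

log_err = abs(log_numeric - log_laplace)
```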
Now, for large $x$, $\log{W} \ll W $, so $ W = \log{x} - \log{W} = \log{x} - \log{(\log{x}+o(\log{x}))} \sim \log{x} $. In particular, apart from $W(en)$ and the additive constant, every term above converges to zero, and $W(en) \sim \log{n}$, whence $$ \log{\lVert X \rVert_n} \sim \log{n} , \qquad\text{i.e.}\qquad \lVert X \rVert_n = n^{1+o(1)} . $$ So $\lVert X \rVert_n = O(n)$ still holds, but no smaller power of $n$ is possible, even with the MGF finite on all of $\mathbb{R}$.