I’m trying to prove the following claim:
Let $X$ be a nonnegative random variable, and let $\phi(t) = \mathbb E\left[e^{itX}\right]$ be its characteristic function. Suppose $\phi$ is differentiable at $t=0$. Then $\mathbb E[X] < \infty$.
This is a component of an exercise in my probability theory textbook (Achim Klenke, “Probability Theory: A Comprehensive Course”, Exercise 15.4.4(iii)).
This question has also been asked here, and there’s an answer that supposedly gives a counterexample, but I’m not convinced: the counterexample uses a symmetric random variable rather than a nonnegative one, and it’s not clear to me how the characteristic function of a symmetric random variable relates to that of the corresponding nonnegative random variable (i.e. its absolute value).
What I’ve tried: Klenke proves the existence of the $2n^\textrm{th}$ moment when $\phi^{(2n)}(0)$ exists for some $n \geq 1$, but I’ve been having trouble adapting the proof to the first-derivative case. I tried considering a symmetric random variable $Y$ with $Y^2 = X$ (and thus $\mathbb E[Y^2] = \mathbb E[X]$). If I could prove that $\phi_Y''(0)$ exists, I’d be done. But letting $f(x) = x^2$, and noting that $f_* \mathbb P_Y = \mathbb P_X$, I ended up computing (using the symmetry of $Y$ to replace $e^{ity}$ by $\cos(ty)$): $$ \phi_Y(t) = \int_{\mathbb R} e^{ity}\, \mathbb P_Y[dy] = \mathbb P[Y=0] + \int_{(0,\infty)} \cos\left(t\sqrt x\right) \mathbb P_X[dx], $$ and it’s not at all obvious to me that this map should be differentiable at $0$ (let alone twice differentiable).
Any suggestions/places to find the answer?
Your claim is correct. By [1], Section XVII.2a (page 565), for any random variable $X$ with characteristic function $\phi(t)=E[e^{itX}]$, the existence of the derivative $\phi'(0)$ implies that the limit $$\lim_{r \to \infty} E\left[X \cdot{\bf 1}_{|X| \le r}\right]$$ exists and is finite. If $X \ge 0$ a.s., this limit equals $E[X]$ by monotone convergence, so $E[X] < \infty$.
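Not a proof, but here is a quick numerical sanity check of that limit statement (the distribution choice is mine, not from Feller): take $X$ Pareto with tail index $\alpha = 2$, which is nonnegative with $E[X] = 2$ finite but $E[X^2] = \infty$. Both the truncated means $E[X \cdot {\bf 1}_{X \le r}]$ and the difference quotient $(\phi(t)-\phi(0))/(it)$ should approach $E[X] = 2$:

```python
import numpy as np

# X ~ Pareto(alpha=2) on [1, inf): density f(x) = 2 x^{-3},
# so E[X] = alpha/(alpha-1) = 2, while E[X^2] = infinity.
# Integrals are approximated by Riemann sums on a fine grid,
# truncating the support at 1e4 (tail mass beyond that is ~2e-4).
x = np.linspace(1.0, 1e4, 2_000_001)
dx = x[1] - x[0]
f = 2.0 * x ** (-3.0)

# Truncated means E[X 1_{X <= r}]; the exact value is 2 (1 - 1/r).
trunc_means = {r: float(np.sum((x * f)[x <= r]) * dx) for r in (10, 100, 1000)}

# Difference quotient (phi(t) - phi(0)) / (it) at a small t; its real
# part should be close to E[X] = 2.  We use the numerically computed
# total mass as phi(0) so discretization errors largely cancel.
t = 0.002
phi_t = np.sum(np.exp(1j * t * x) * f) * dx
phi_0 = np.sum(f) * dx
diff_quot = float(((phi_t - phi_0) / (1j * t)).real)

print(trunc_means)  # increases toward E[X] = 2 as r grows
print(diff_quot)    # close to E[X] = 2
```

Of course this only illustrates the easy direction (finite mean implies the limits agree); Feller's argument is what gives the converse implication from $\phi'(0)$.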
[1] Feller, William. *An Introduction to Probability Theory and Its Applications*, Vol. 2. John Wiley & Sons, 1971 (republished 2008).