differentiability of characteristic function - additional question


In Wikipedia, we have the theorem about characteristic functions:

Provided the $n$-th moment exists, the characteristic function can be differentiated $n$ times, and:

$$ E[X^n] = i^{-n} \phi_X^{(n)}(0) = i^{-n} \left[ \frac{d^n}{dt^n} \phi_X(t) \right]_{t=0}.$$
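As a quick numeric sanity check of this formula (not part of the original question), here is a sketch using the standard normal distribution, whose characteristic function is $\phi(t) = e^{-t^2/2}$; the derivatives at $0$ are approximated by central finite differences:

```python
import cmath

# Characteristic function of the standard normal: phi(t) = exp(-t^2 / 2).
def phi(t):
    return cmath.exp(-t * t / 2)

h = 1e-3  # step size for the finite differences

# First derivative at 0; the theorem gives E[X] = i^{-1} phi'(0).
d1 = (phi(h) - phi(-h)) / (2 * h)
first_moment = (d1 / 1j).real

# Second derivative at 0; the theorem gives E[X^2] = i^{-2} phi''(0).
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / (h * h)
second_moment = (d2 / (1j ** 2)).real

# Prints values near 0 and 1, matching E[X] = 0 and E[X^2] = 1.
print(first_moment, second_moment)
```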

My additional question is:

Is it possible to conclude the converse, under some restrictions on $X$? If $X$ is a non-negative random variable, can we conclude:

$\phi_X$ is differentiable at $0$ $\Rightarrow$ the first moment exists?


I have a result which states: Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of identically distributed random variables on $(\Omega, \mathcal{F}, P)$, and assume $E[X_1^-]<\infty$ and $E[X_1^+]=\infty$. Then for every $M \in (0, \infty)$ we have that:

$$\min\{X_1,M\}\in \mathcal{L}^1(P) \text{ and } \frac{1}{n}\sum_{k=1}^{n} \min\{X_k,M\} \geq E[\min\{X_1,M\}]$$
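To see what this truncation result looks like numerically, here is a hedged sketch (my own example, not from the question) using a Pareto-type variable $X = 1/U$ with $U$ uniform on $(0,1]$, for which $E[X]=\infty$ but $E[\min\{X,M\}] = \ln M + 1$:

```python
import math
import random

random.seed(0)

# X = 1/U with U uniform on (0, 1] satisfies P(X > x) = 1/x for x >= 1,
# so E[X] = infinity, yet the truncated variable min{X, M} is integrable
# with E[min{X, M}] = ln M + 1 (a direct computation for this example).
M = 10.0
n = 200_000
avg = sum(min(1.0 / (1.0 - random.random()), M) for _ in range(n)) / n

# The sample average of the truncated variable settles near ln M + 1.
print(avg, math.log(M) + 1)
```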

I tried to use this, but haven't gotten far.


There are 2 best solutions below


Here are the details of a counterexample from Chung's book. Let $X$ take integer values with $P\{X=n\}=\frac{C}{n^{2} \ln |n|}$ for $|n| > 1$, and $P\{X=n\}=0$ for $n \in \{-1,0,1\}$, where $C$ is chosen so that the probabilities add up to $1$. The characteristic function is given by $$\phi(t) = 2C\sum_{n=2}^{\infty} \frac{\cos(nt)}{n^{2} \ln n}.$$

Let $S_k$ be the $k$-th partial sum, so $S_k'(t) = -2C\sum_{n=2}^{k} \frac{\sin(nt)}{n \ln n}$. It is well known that if $\{a_n\}$ decreases to $0$, then $\sum a_n \sin(nx)$ converges uniformly iff $na_n \to 0$ $\cdots$ (1). In our case $a_n = \frac{2C}{n \ln n}$, so $na_n = \frac{2C}{\ln n} \to 0$ and the condition is satisfied. Hence $S_k \to \phi$ uniformly and $S_k' \to g$ uniformly for some function $g$.

An elementary calculus argument shows that $\phi$ is differentiable with derivative $g$: write $S_k(x)-S_k(0)=\int_0^{x} S_k'(y)\,dy$ and take limits to get $\phi(x)-\phi(0) =\int_0^{x} g(y)\,dy$; being a uniform limit of continuous functions, $g$ is continuous, so $\phi'(x)=g(x)$. Thus $\phi'(0)$ exists. But $E|X|=\infty$ because $\sum_{n=2}^{\infty} \frac{1}{n \ln n} = \infty$.

A reference for (1) is p. 112 of Fourier Series by R. E. Edwards. It is easy to modify this example to get a non-negative random variable with the same properties: in fact, $X$ is the symmetrization of a random variable $Y$ taking only the values $2, 3, \dots$, and the characteristic function of $Y$ has the same properties.
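The argument leans on two numeric facts about the weights $a_n = \frac{1}{n \ln n}$ (dropping the constant $2C$, which affects neither claim): $n a_n \to 0$, which gives uniform convergence of the sine series, while $\sum a_n = \infty$, which gives $E|X| = \infty$. A short sketch illustrating both:

```python
import math

# Weights from Chung's example, without the constant 2C.
def partial_sum(N):
    return sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))

# n * a_n = 1/ln n tends to 0, so the sine series converges uniformly ...
for n in (10**3, 10**4, 10**5):
    print(n, 1.0 / math.log(n))

# ... yet the partial sums of a_n keep growing (roughly like ln ln N),
# so the series diverges and E|X| = infinity.
for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(N))
```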


The example from Chung's book cannot be modified to work for a non-negative random variable. By [1], Section XVII.2a (page 565), for any random variable $X$ with characteristic function $\phi(t)=E[e^{itX}]$, the existence of the derivative $\phi'(0)$ implies that the limit $$\lim_{r \to \infty} E[X \cdot{\bf 1}_{|X| \le r}]$$ exists and is finite. Of course, if $X \ge 0$ a.s., then this limit equals $E[X]$ by monotone convergence.

[1] Feller, William. An Introduction to Probability Theory and Its Applications, vol. 2. John Wiley & Sons, 1971 (republished 2008).