Relation between characteristic function and mean


Let $X$ be a random variable and let $\phi$ be its characteristic function.

Does $\phi'(0)=ia$ imply $EX=a$?

If not, is there any counterexample?

I thought a counterexample might come from the Cauchy distribution, since its mean does not exist while its characteristic function exists, $\phi(t)=e^{-|t|}$. But this function is not differentiable at $0$.
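The non-differentiability at $0$ is easy to check numerically. A quick sketch in Python (standard library only): the one-sided difference quotients of $\phi(t)=e^{-|t|}$ at $0$ approach $-1$ and $+1$, while the symmetric quotient is identically $0$.

```python
import math

def phi(t):
    # characteristic function of the standard Cauchy distribution
    return math.exp(-abs(t))

for h in [1e-1, 1e-3, 1e-6]:
    right = (phi(h) - phi(0)) / h       # right-hand quotient, tends to -1
    left = (phi(0) - phi(-h)) / h       # left-hand quotient, tends to +1
    sym = (phi(h) - phi(-h)) / (2 * h)  # symmetric quotient, exactly 0
    print(h, right, left, sym)
```

The symmetric quotient vanishes because $\phi$ is even, which is exactly why the two-sided derivative fails to exist even though the symmetric limit is $0$.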



Best Answer

In W. Feller's brilliant book *An Introduction to Probability Theory and Its Applications*, Vol. 2, Chapter XVII, Section 2a, you can find a proof of the following theorem (A. Zygmund (1947), E. J. G. Pitman (1956)):

Theorem. Each of the following three conditions implies the other two.

  1. $\phi'(0)=i\mu$.
  2. As $t\to\infty$,

$$t\left[1-F(t)+F(-t)\right]\to 0, \quad \int_{-t}^t xF\{dx\}\to \mu. $$

  3. The average $(X_1+\dots+X_n)/n$ of i.i.d. copies $X_1, X_2, \dots$ of $X$ tends in probability to $\mu$.
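For the Cauchy example from the question, condition 3 fails along with the others: the average of $n$ i.i.d. standard Cauchy variables is again standard Cauchy, so it never concentrates. A small simulation (a sketch in Python, standard library only) illustrates this: the probability that the sample mean exceeds $1$ in absolute value stays near $1/2$ no matter how large $n$ is.

```python
import math
import random

random.seed(0)

def cauchy():
    # inverse-cdf sampling: tan(pi*(U - 1/2)) is standard Cauchy for U ~ U(0,1)
    return math.tan(math.pi * (random.random() - 0.5))

# The mean of n i.i.d. standard Cauchy variables is again standard Cauchy,
# so P(|mean| > 1) = 1/2 for every n instead of shrinking to 0.
reps = 200
for n in [10, 100, 10000]:
    hits = sum(abs(sum(cauchy() for _ in range(n)) / n) > 1
               for _ in range(reps))
    print(n, hits / reps)
```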

So we can construct a distribution that satisfies (2) but whose expectation does not exist. For instance, consider the symmetric distribution with cdf
$$ F(t)=\begin{cases}\dfrac{e}{2|t|\ln(|t|)} & \text{if } t<-e,\\ \dfrac12 & \text{if } -e\leq t\leq e,\\ 1-\dfrac{e}{2t\ln(t)} & \text{if } t>e.\end{cases} $$
For $t>e$,
$$t\left[1-F(t)+F(-t)\right]=\dfrac{e}{\ln(t)}\to 0 \text{ as } t\to\infty, $$
and symmetry implies that
$$\int_{-t}^t xF\{dx\} =0 \text{ for every } t>0.$$
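The tail computation is easy to verify numerically. A sketch in Python (standard library only), implementing the cdf above and checking that $t\left[1-F(t)+F(-t)\right]$ agrees with $e/\ln(t)$ and tends to $0$:

```python
import math

E = math.e

def F(t):
    # cdf of the symmetric distribution constructed above
    if t < -E:
        return E / (2 * abs(t) * math.log(abs(t)))
    if t <= E:
        return 0.5
    return 1 - E / (2 * t * math.log(t))

# t*[1 - F(t) + F(-t)] should equal e/ln(t), which tends to 0 as t grows
for t in [10.0, 1e3, 1e6, 1e12]:
    tail = t * (1 - F(t) + F(-t))
    print(t, tail, E / math.log(t))
```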

So condition (2) of the Theorem is fulfilled with $\mu=0$, and hence $\phi'(0)=0$.

But the expectation does not exist, since $1-F(t)+F(-t)$ is not integrable on $\mathbb R_+$. You can also see it directly: the pdf is
$$ f(t)=\frac{e}{2}\cdot\dfrac{\ln(|t|)+1}{t^2\ln^2(|t|)}, \quad |t|>e, $$
and
$$ E|X|=2\int_e^\infty tf(t)\,dt =e\int_e^\infty \dfrac{\ln(t)+1}{t\ln^2(t)} \, dt = e\int_e^\infty \dfrac{\ln(t)+1}{\ln^2(t)} \, d(\ln(t))=+\infty. $$
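The divergence can also be watched numerically. Carrying out the substitution $u=\ln(t)$ in the last integral up to a finite cutoff $T$ gives the truncated expectation in closed form, $E\left[|X|;\,|X|\le T\right] = e\left(\ln\ln T - 1/\ln T + 1\right)$ for $T>e$ (my own short computation, not from the answer above), which grows without bound, roughly like $e\ln\ln T$. A quick sketch in Python:

```python
import math

E = math.e

def truncated_abs_mean(T):
    # E[|X| ; |X| <= T] = e*(ln ln T - 1/ln T + 1) for T > e,
    # from the substitution u = ln(t) in the integral for E|X|
    u = math.log(T)
    return E * (math.log(u) - 1.0 / u + 1.0)

# The truncated mean keeps increasing (like e * ln ln T), so E|X| = +infinity
for T in [1e2, 1e4, 1e8, 1e16, 1e32]:
    print(T, truncated_abs_mean(T))
```

The growth is extremely slow, which is why this distribution obeys the weak law of large numbers even though its mean is undefined.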