Deducing expectation from characteristic function without using Lévy's formula


Consider a random variable $ X:[a,b]\rightarrow \mathbb R^+ $. We do not know the distribution of $ X $, but we know its characteristic function $ \phi_X(\cdot) $.

I know that with Lévy's formula I can find the distribution function and with it prove or disprove the existence of the expected value, but how can I do this without using Lévy's formula?


Theorem (or exercise): Let $X$ be a random variable with distribution function $F(\cdot)$ and characteristic function $\phi(\cdot)$, and suppose that $E|X|^n<\infty$ for some integer $n\geq 1$. Then for every $k=1,\dots,n$:

1) $\phi$ has uniformly continuous derivatives $\phi^{(k)}$, and $$\phi^{(k)}(t)=i^k\,E[X^k e^{itX}];$$

2) $E(X^k)=\dfrac{\phi^{(k)}(0)}{i^k}$.
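As a sanity check of part 2), here is a small numerical sketch for a concrete choice of distribution (Exponential(1), whose characteristic function $\phi(t)=1/(1-it)$ and moments $E[X^k]=k!$ are standard facts; the step size `h` and variable names are my own):

```python
# Numerically recover E[X] and E[X^2] from finite differences of the
# characteristic function of Exponential(1): phi(t) = 1/(1 - i t).
phi = lambda t: 1.0 / (1.0 - 1j * t)

h = 1e-5

# E[X] = phi'(0) / i, with phi'(0) approximated by a central difference
phi_prime_0 = (phi(h) - phi(-h)) / (2 * h)
EX = phi_prime_0 / 1j          # should be close to 1! = 1

# E[X^2] = phi''(0) / i^2, with a second-order central difference
phi_second_0 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2
EX2 = phi_second_0 / (1j**2)   # should be close to 2! = 2

print(EX.real, EX2.real)
```

So if the derivatives of $\phi$ at $0$ are available (analytically or numerically), the moments follow without ever inverting $\phi$ via Lévy's formula.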

Partial converse: Let $X$ be a random variable with distribution function $F(\cdot)$ and characteristic function $\phi(\cdot)$, and suppose that $\phi^{(n)}(0)$ exists (and is finite) for some even integer $n\geq 2$. Then $E(X^n)<\infty$.

Interesting example: Let $X$ be a random variable with $$ P(X=n)=P(X=-n)=\frac{c}{n^2\log n}\hspace{10pt}n=2,3,...$$ Then one can show $\phi^{(1)}(0)=0$ (if you take for granted that differentiation and summation may be interchanged in this scenario, the computation is not hard at all), i.e. $\phi^{(n)}(0)$ exists for $n=1$, which is odd. But one can also show $E|X|=\infty$, so the partial converse fails for odd $n$.
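The divergence of $E|X|$ in this example can be seen numerically: the normalizing sum $\sum 1/(n^2\log n)$ converges, while $E|X|=2c\sum 1/(n\log n)$ grows without bound (roughly like $\log\log N$). A minimal sketch (the function names and cutoffs are my own, for illustration only):

```python
import math

def prob_sum(N):
    """Partial sum of 1/(n^2 log n) -- converges, so c is well defined."""
    return sum(1.0 / (n**2 * math.log(n)) for n in range(2, N + 1))

def abs_moment_sum(N):
    """Partial sum of 1/(n log n) -- diverges slowly, like log log N."""
    return sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))

for N in (10**3, 10**4, 10**5, 10**6):
    print(N, prob_sum(N), abs_moment_sum(N))
# The probability sums stabilize quickly, while the moment sums keep
# growing, consistent with E|X| = infinity even though phi'(0) = 0.
```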

For a proof (which is long and uses the dominated convergence theorem) you can have a look at p. 295 of

  • Probability Theory: Independence, Interchangeability, Martingales by Yuan Shih Chow and Henry Teicher