Is a linear combination of characteristic functions a characteristic function?


Let $\phi_k(t)$ be the characteristic function of a random variable $X_k$, $k = 1,2,\dots$. Consider a set of positive real numbers $\{p_1, p_2, \dots \}$ and define the function $$\phi(t) = \sum_{k=1}^{\infty}p_k\phi_k(t).$$ What are the conditions on $\{p_1, p_2, \dots \}$ under which $\phi$ is a characteristic function?

I know that a characteristic function needs to satisfy the following properties: $$\phi(0) = 1$$ $$\phi(-t) = \overline{\phi(t)}$$ $$|\phi(t)|=\left|E[e^{itX}]\right|\leq E|e^{itX}|=1$$ $$|\phi(t+h)-\phi(t)|\leq E|e^{ihX}-1|$$ $$E[e^{it(aX+b)}]=e^{itb}\phi(at)$$

But I think that satisfying these properties alone does not ensure that $\phi(t)$ is the characteristic function of some random variable. From which direction should I approach this problem?

Thank you very much for the help.



Considering that $\phi(0)=1$ and that $\phi_k(0)=1$ for every $k$, one sees that the condition $$\sum_kp_k=1$$ is necessary. To prove that it is also sufficient, and to avoid most of the technicalities, consider some random variables $N$ and $(X_k)$ defined on the same probability space, with $N$ independent of $(X_k)$, each $X_k$ with characteristic function $\phi_k$, and $N$ integer valued with distribution $P(N=k)=p_k$ for every $k$. Then the random variable $$X_N=\sum_kX_k\mathbf 1_{N=k}$$ has characteristic function $$\phi=\sum_kp_k\phi_k.$$ In particular, $\phi$ is a characteristic function.
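The construction above can be checked numerically. Here is a minimal sketch (the two-component mixture of normals and all parameter values are my own illustrative choices, not from the answer): draw $N$ with $P(N=k)=p_k$, set $X_N=X_k$ on the event $\{N=k\}$, and compare the empirical characteristic function of $X_N$ with $\sum_k p_k\phi_k(t)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical two-component example: p = (0.3, 0.7),
# X_1 ~ N(0, 1) and X_2 ~ N(2, 0.25).
p = np.array([0.3, 0.7])
mus = np.array([0.0, 2.0])
sigmas = np.array([1.0, 0.5])

# Draw N with P(N = k) = p_k, then X_N = X_k on the event {N = k}.
N = rng.choice(len(p), size=n, p=p)
X = rng.normal(mus[N], sigmas[N])

t = 0.8
# Empirical characteristic function of X_N at the point t.
phi_emp = np.mean(np.exp(1j * t * X))
# Convex combination of the component CFs: phi_k(t) = exp(i mu_k t - sigma_k^2 t^2 / 2).
phi_mix = np.sum(p * np.exp(1j * mus * t - 0.5 * (sigmas * t) ** 2))

print(abs(phi_emp - phi_mix))  # small Monte Carlo error
```

With $n = 200{,}000$ samples the two values agree up to Monte Carlo noise of order $1/\sqrt{n}$.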


I thought of another way of proving this. (Admittedly, I may have misunderstood something in the last line of the previous answer: I don't see why $\phi$ is the characteristic function we were looking for...)

We have characteristic functions $\phi_1(t),\dots,\phi_n(t)$ and a set $\{p_1,\dots,p_n\}$ of positive real numbers with the condition that $\sum_{k=1}^n p_k = 1$. We want to prove that $\sum_{k=1}^n p_k \phi_k(t)$ is a characteristic function.

An observation: I take $n$ finite, because I will state something about the *convex* linear combination of the functions, and I am not sure whether this idea generalizes to infinitely many of them, and also because, later, we will interchange a sum with a differential and with an integral.

So, as you said, checking that a function satisfies the conditions you stated is not enough to conclude that it is a characteristic function. We could use the inversion theorem, or instead we can exhibit a random variable whose characteristic function is the one we have.

Well, we have $F_1,\dots, F_n$, the distribution functions of some random variables $X_1,\dots, X_n$. A convex linear combination of distribution functions is again a distribution function (this is easy to check). So we define $F:=\sum_{k=1}^n p_k F_k$, and $F$ is the distribution function of some random variable $X$.
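The "easy to check" claim can be illustrated numerically. A minimal sketch, with two normal CDFs and weights of my own choosing: the convex combination $F = p_1 F_1 + p_2 F_2$ should again be nondecreasing with limits $0$ and $1$.

```python
import numpy as np
from math import erf

# Standard normal CDF, evaluated pointwise on a grid.
def Phi(z):
    return np.array([0.5 * (1 + erf(v)) for v in z / np.sqrt(2)])

p1, p2 = 0.25, 0.75           # convex weights: p1 + p2 = 1, both positive
x = np.linspace(-20, 20, 2001)
F1 = Phi(x)                    # CDF of N(0, 1)
F2 = Phi((x - 2) / 3)          # CDF of N(2, 9)
F = p1 * F1 + p2 * F2          # convex combination of distribution functions

# F inherits the CDF properties: nondecreasing, limits 0 at -inf and 1 at +inf.
assert np.all(np.diff(F) >= -1e-12)
assert F[0] < 1e-6 and F[-1] > 1 - 1e-6
```

Monotonicity and the limits follow termwise from the same properties of $F_1$ and $F_2$, since the weights are nonnegative and sum to $1$; right-continuity is inherited the same way.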

We will calculate $\phi(t)=\mathbb{E}(e^{itX})$ using the Riemann–Stieltjes integral:
$$\phi(t)=\mathbb{E}(e^{itX})=\int e^{itx}\, dF(x)=\int e^{itx}\, d\left[\sum_k p_kF_k(x)\right]=\int e^{itx} \sum_k p_k\, dF_k(x)= \sum_k p_k \int e^{itx}\, dF_k(x)=\sum_k p_k\, \mathbb{E}(e^{itX_k})=\sum_k p_k \phi_k(t).$$

We justify these equalities by noting that the sum is finite, so we may interchange it with the Stieltjes differential and with the integral.
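The chain of equalities above can be sanity-checked numerically. In this sketch (components, weights, and the evaluation point $t$ are illustrative assumptions) the Stieltjes integral $\int e^{itx}\,dF(x)$ is approximated with the mixture density $\sum_k p_k f_k$, since each $F_k$ here is absolutely continuous, and compared to the direct sum $\sum_k p_k \phi_k(t)$.

```python
import numpy as np

# Two normal components with hypothetical parameters; phi_k are their known CFs.
p = np.array([0.4, 0.6])
mus = np.array([-1.0, 3.0])
sigmas = np.array([1.0, 2.0])

x = np.linspace(-15.0, 20.0, 200_001)
dx = x[1] - x[0]
dens = lambda mu, s: np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
# dF(x) = f_mix(x) dx, where f_mix = sum_k p_k f_k is the mixture density.
f_mix = sum(pk * dens(mu, s) for pk, mu, s in zip(p, mus, sigmas))

t = 1.3
# Left: the integral  ∫ e^{itx} dF(x), computed as a Riemann sum.
phi_int = np.sum(np.exp(1j * t * x) * f_mix) * dx
# Right: the convex combination  sum_k p_k phi_k(t)  of the component CFs.
phi_sum = np.sum(p * np.exp(1j * mus * t - 0.5 * (sigmas * t) ** 2))

print(abs(phi_int - phi_sum))  # discretization error only
```

The two values agree up to the discretization error of the grid, as the finite interchange of sum and integral predicts.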

I hope this is useful for you. Sorry for my English.