A proof or reference that $\phi_{(X_1, \dots, X_k)} = \phi_{X_1} \cdots \phi_{X_k}$ implies $X_1, \dots, X_k$ are independent.


Let $(X_1, \dots, X_k)$ be a random vector. Assume that we have the following decomposition of characteristic functions:

$$\phi_{(X_1, \dots, X_k)}(t_1, \dots, t_k) = \phi_{X_1}(t_1) \cdots \phi_{X_k}(t_k) \quad \text{for all } (t_1, \dots, t_k) \in \mathbb{R}^k.$$

Then is it true that $X_1, \dots, X_k$ are independent? The converse is obvious, but I'd like a reference or a proof for the implication I stated. You can assume that I'm familiar with basic weak convergence/characteristic function theory.


Best answer:

The characteristic function of a random vector determines its probability distribution (the uniqueness theorem for characteristic functions). Now let $Y_1, \dots, Y_k$ be independent random variables on a common probability space (for instance, a product space), with each $Y_i$ having the same distribution as the corresponding $X_i$. By independence and Fubini's theorem,

$$\phi_{(Y_1, \dots, Y_k)}(t_1, \dots, t_k) = E\Big[\prod_{i=1}^k e^{i t_i Y_i}\Big] = \prod_{i=1}^k E\big[e^{i t_i Y_i}\big] = \phi_{X_1}(t_1) \cdots \phi_{X_k}(t_k).$$

Therefore, if $\phi_{(X_1,\dots,X_k)}(t_1, \dots, t_k) = \phi_{X_1}(t_1) \cdots \phi_{X_k}(t_k)$ for all $(t_1, \dots, t_k)$, then $(X_1, \dots, X_k)$ has the same distribution as $(Y_1, \dots, Y_k)$; in particular, $X_1, \dots, X_k$ are independent.
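As a numerical sanity check (not part of the proof), one can compare the empirical joint characteristic function of a sample against the product of the empirical marginal characteristic functions. A minimal sketch with NumPy, where the sample size, distributions, and test point $t = (0.7, -1.3)$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def ecf(samples, t):
    # Empirical characteristic function: sample average of exp(i <t, X>)
    return np.mean(np.exp(1j * samples @ t))

# Independent case: X1, X2 i.i.d. standard normal
X_indep = rng.standard_normal((n, 2))

# Dependent case: X2 = X1, so the joint CF should NOT factor
X1 = rng.standard_normal(n)
X_dep = np.column_stack([X1, X1])

t = np.array([0.7, -1.3])

joint_indep = ecf(X_indep, t)
prod_indep = ecf(X_indep[:, :1], t[:1]) * ecf(X_indep[:, 1:], t[1:])

joint_dep = ecf(X_dep, t)
prod_dep = ecf(X_dep[:, :1], t[:1]) * ecf(X_dep[:, 1:], t[1:])

print(abs(joint_indep - prod_indep))  # close to 0 (up to sampling error)
print(abs(joint_dep - prod_dep))      # clearly bounded away from 0
```

For the dependent pair, the exact joint CF at $(t_1, t_2)$ is $e^{-(t_1+t_2)^2/2}$ while the product of marginals is $e^{-(t_1^2+t_2^2)/2}$, so the gap at this test point is of order $0.5$, far above the sampling noise.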