Show that $\phi_{X_1,X_2}(t,t) = \phi_{X_1}(t) \phi_{X_2}(t)$ does not imply independence of $X_1$ and $X_2$


Let $X_1,X_2$ be random variables.

Let $\phi_{X_1,X_2}(t_1,t_2)$ be the characteristic function of $(X_1,X_2)$.

Let $\phi_{X_1}(t), \phi_{X_2}(t)$ be the characteristic functions of $X_1$ and $X_2$, respectively.

I know that $X_1, X_2$ are independent if and only if $\phi_{X_1,X_2}(t_1,t_2) = \phi_{X_1}(t_1)\phi_{X_2}(t_2)$ for all $t_1,t_2$.

But I don't think that $\phi_{X_1,X_2}(t,t)=\phi_{X_1}(t)\phi_{X_2}(t)$ implies independence of $X_1,X_2$. Can you help me?

On BEST ANSWER

Consider the particular case $X_1 = X_2$; then the identity

$$\phi_{X_1,X_2}(t,t) = \phi_{X_1}(t) \phi_{X_2}(t) \tag{1}$$

becomes

$$\phi_{X_1}(2t) = (\phi_{X_1}(t))^2. \tag{2}$$

If we can show that there exists a (non-trivial) distribution $\mu$ such that $X_1 \sim \mu$ satisfies $(2)$, then it is obvious that $(1)$ does not imply independence of $X_1$ and $X_2$ (because we can simply choose $X_1 = X_2 \sim \mu$ in $(1)$).
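Note that $(2)$ is a genuine restriction: not every distribution satisfies it. As a quick sanity check (a small Python sketch, not part of the original answer), the standard normal, with $\phi(t) = e^{-t^2/2}$, fails $(2)$:

```python
import numpy as np

def phi_normal(t):
    # Characteristic function of a standard normal: exp(-t^2 / 2)
    return np.exp(-t**2 / 2)

t = 1.0
lhs = phi_normal(2 * t)      # exp(-2t^2) = exp(-2) for t = 1
rhs = phi_normal(t) ** 2     # exp(-t^2)  = exp(-1) for t = 1
print(lhs, rhs)              # the two values differ, so (2) fails
```

So the task really is to find a special distribution for which $(2)$ holds.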

To get some intuition what kind of distribution could do the job, have a closer look at $(2)$. For brevity of notation, set $\phi := \phi_{X_1}$. Obviously, by $(2)$,

$$\phi(2) = \phi(1)^2$$

and, by iteration,

$$\phi(2^k) = \phi(1)^{2^k}$$

for any $k \in \mathbb{N}$, i.e. $\phi$ satisfies

$$\phi(t) = \phi(1)^t \qquad \text{for all $t=2^k$, $k \in \mathbb{N}$.}$$

Suppose for the moment that $\mu$ is symmetric; then $\phi$ is real-valued and $\phi(t) = \phi(-t)$ for all $t \in \mathbb{R}$, and therefore

$$\phi(t) = \phi(1)^{|t|} \qquad \text{for all $t = \pm 2^k$, $k \in \mathbb{N}$.} \tag{3} $$ This gives us some intuition of what $\phi$ looks like; let us define

$$\phi(t) := c^{|t|}, \qquad t \in \mathbb{R},$$

for some constant $c>0$. It is obvious that $\phi$ satisfies $(2)$, and therefore we just have to choose $c>0$ in such a way that $\phi$ is the characteristic function of a (non-trivial) distribution $\mu$. Since $|\phi(t)| \leq 1$ for any characteristic function, we have to choose $c \in (0,1]$. Moreover, because $\mu$ is non-trivial, we also have $c \neq 1$ (for $c=1$, $\phi \equiv 1$ is the characteristic function of the point mass at $0$). For any $c \in (0,1)$ we can write $c=e^{-\gamma}$ for some $\gamma>0$, and so

$$\phi(t) = e^{-\gamma |t|}, \qquad t \in \mathbb{R}.$$

It is well known that this function is the characteristic function of a symmetric (non-trivial) distribution, namely the Cauchy distribution with scale parameter $\gamma>0$.
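The conclusion can also be checked numerically. Below is a minimal Monte-Carlo sketch in Python (not part of the original answer): with $X_1 = X_2 = X \sim \text{Cauchy}(\gamma)$, the empirical version of $\phi_{X_1,X_2}(t,t) = \mathbb{E}[e^{it(X_1+X_2)}] = \phi_X(2t)$ agrees with $\phi_{X_1}(t)\phi_{X_2}(t) = \phi_X(t)^2$ up to sampling error, even though $X_1$ and $X_2$ are as dependent as possible.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 1.0
# X1 = X2 = X ~ Cauchy with scale gamma; e^{itX} is bounded, so the
# sample mean of e^{itX} is a consistent estimator of phi_X(t).
x = gamma * rng.standard_cauchy(1_000_000)

def ecf(samples, t):
    """Empirical characteristic function: sample mean of exp(i t X)."""
    return np.mean(np.exp(1j * t * samples))

for t in [0.5, 1.0, 2.0]:
    lhs = ecf(x, 2 * t)     # phi_{X1,X2}(t,t) = E[e^{i t (X1 + X2)}] = phi_X(2t)
    rhs = ecf(x, t) ** 2    # phi_{X1}(t) * phi_{X2}(t)
    exact = np.exp(-2 * gamma * abs(t))   # analytic value e^{-2 gamma |t|}
    print(t, abs(lhs - rhs), abs(lhs - exact))
```

Both printed discrepancies are of the order of the Monte-Carlo error ($\approx 10^{-3}$ for $10^6$ samples), while $X_1 = X_2$ are clearly not independent.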