Independence of a vector vs. independence of its components


Suppose that $Z$ (a scalar) is independent of $X=(X_1,\ldots,X_n)$, $n>1$. Then $Z$ is independent of each of $X_1,\ldots,X_n$, because each of the latter is a function of $X$.

I suspect the reverse direction is not true: $Z$ being independent of each of $X_1,\ldots,X_n$ does not imply that $Z$ is independent of $X$, most likely because we would need some information about the joint distribution of $X_1,\ldots,X_n$. But I can't think of a counterexample. Could you please provide one, along with some intuition on how you arrive at it?



BEST ANSWER

Yes, in general the converse fails. The reason is, essentially, that pairwise independence does not imply joint independence.

Example. Let $\Omega := \{0,1\}^2$ and $\mathbb{P}(\{\omega\}) := 1/4$ for each $\omega \in \Omega$. For $\omega = (\omega_1,\omega_2)$ define

$$X_1(\omega) := \omega_1 \qquad X_2(\omega) := \omega_2 \qquad Z(\omega) :=1_{\{\omega_1=\omega_2\}}.$$

Then it is not difficult to see that

$$ \mathbb{P}(X_1 = i, Z=j) = \frac{1}{4} = \mathbb{P}(X_1=i) \mathbb{P}(Z=j)$$

for any $i,j \in \{0,1\}$, which shows that $X_1$ and $Z$ are independent. Analogously, $X_2$ and $Z$ are independent. However, $Z$ and $X:=(X_1,X_2)$ are not independent, since

$$\mathbb{P}(Z=1, X=(1,1)) = \frac{1}{4} \neq \frac{1}{8} = \mathbb{P}(Z=1) \mathbb{P}(X=(1,1)).$$
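The four-point space is small enough to verify by brute-force enumeration. Here is a sketch in Python (not part of the original answer; the helper `prob` and the variable names are just illustrative) that checks both the pairwise factorizations and the failing joint one, using exact fractions:

```python
from itertools import product
from fractions import Fraction

# Sample space Omega = {0,1}^2 with uniform probability 1/4 per point.
omega = list(product([0, 1], repeat=2))
P = Fraction(1, 4)

def prob(event):
    """Probability of the set of outcomes satisfying the predicate."""
    return sum(P for w in omega if event(w))

X1 = lambda w: w[0]
X2 = lambda w: w[1]
Z = lambda w: int(w[0] == w[1])  # indicator of {omega_1 = omega_2}

# Pairwise independence: P(X_k = i, Z = j) = P(X_k = i) P(Z = j) for all i, j.
for X in (X1, X2):
    for i, j in product([0, 1], repeat=2):
        assert (prob(lambda w: X(w) == i and Z(w) == j)
                == prob(lambda w: X(w) == i) * prob(lambda w: Z(w) == j))

# But Z is not independent of the vector X = (X1, X2):
lhs = prob(lambda w: Z(w) == 1 and (X1(w), X2(w)) == (1, 1))
rhs = prob(lambda w: Z(w) == 1) * prob(lambda w: (X1(w), X2(w)) == (1, 1))
assert lhs == Fraction(1, 4) and rhs == Fraction(1, 8)  # 1/4 != 1/8
```

Exact `Fraction` arithmetic avoids any floating-point tolerance questions when comparing the two sides.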


It is not only about the joint distribution of $(X_1, \dots, X_n)$. It is possible that all components of $X$, namely $X_1, \dots, X_n$, are independent and that all pairs $(Z,X_i)$ are independent, yet $Z$ is not independent of $X$.

My favorite example to illustrate this is from Bauer's book Wahrscheinlichkeitstheorie (a German textbook): suppose you throw a fair die twice and let $Y_i$ denote the outcome of the $i$-th throw. We define: $$ X_1= \begin{cases}1, & \text{if} \ Y_1 \ \text{is odd} \\ 0, & \text{if} \ Y_1 \ \text{is even} \end{cases}, \quad X_2= \begin{cases}1, & \text{if} \ Y_2 \ \text{is odd} \\ 0, & \text{if} \ Y_2 \ \text{is even} \end{cases}, \quad Z= \begin{cases}1, & \text{if} \ Y_1+Y_2 \ \text{is odd} \\ 0, & \text{if} \ Y_1+Y_2 \ \text{is even.} \end{cases} $$ Clearly $X_1$ and $X_2$ are independent, and it is also easily verified that $Z$ is independent of $X_1$ and that $Z$ is independent of $X_2$. However, $Z$ is not independent of $X$.

This easily generalizes to $n \geq 2$: throw a fair die $n$ times, define $X_i$ as above, and again define $Z$ via the parity of the sum $Y_1 + \dots + Y_n$. The intuition is that once you know the results of all the $X_i$ except one, say $X_n$, the outcomes of $Z$ and of $X_n$ carry exactly the same information.
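The generalized example can be checked the same way by enumerating all $6^n$ outcomes. The Python sketch below (with $n=3$ as an illustrative choice; any $n \geq 2$ works) verifies that $Z$ is independent of each single $X_i$ but not of the vector $(X_1, \dots, X_n)$:

```python
from itertools import product
from fractions import Fraction

n = 3  # illustrative choice; any n >= 2 exhibits the phenomenon
throws = list(product(range(1, 7), repeat=n))  # all outcomes of n fair-die throws
P = Fraction(1, 6) ** n  # each outcome is equally likely

def prob(event):
    """Probability of the set of outcomes satisfying the predicate."""
    return sum(P for y in throws if event(y))

def X(i, y):
    return y[i] % 2          # 1 if the i-th throw is odd, 0 if even

def Z(y):
    return sum(y) % 2        # 1 if the total is odd, 0 if even

# Z is independent of each single X_i ...
for i in range(n):
    for a, b in product([0, 1], repeat=2):
        assert (prob(lambda y: X(i, y) == a and Z(y) == b)
                == prob(lambda y: X(i, y) == a) * prob(lambda y: Z(y) == b))

# ... but not of the vector (X_1, ..., X_n): all throws odd forces the parity of Z.
all_odd = lambda y: all(X(i, y) == 1 for i in range(n))
lhs = prob(lambda y: all_odd(y) and Z(y) == n % 2)
rhs = prob(all_odd) * prob(lambda y: Z(y) == n % 2)
assert lhs != rhs  # for n = 3: 1/8 on the left, 1/16 on the right
```

The last assertion makes the intuition concrete: conditioning on the whole vector (here, on all throws being odd) pins down $Z$ completely, while no single component does.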