Let $X_0$, $X_1$ and $X_2$ be three mutually independent random variables. We define two more random variables $D_1$ and $D_2$ as follows: $$D_1 = X_1 + X_0, \qquad D_2 = X_2 + X_0.$$ We're interested in arguing, as simply and effectively as possible, whether
- $D_1$ and $D_2$ are independent
- $D_1|X_0=x_0$ and $D_2|X_0=x_0$ are independent
1. If $D_1$ and $D_2$ were independent, then $\text{Cov}(D_1,D_2)=0$ would have to hold:
$$\begin{aligned} \text{Cov}(D_1, D_2) &= \mathbb{E}[D_1 D_2]-\mathbb{E}[D_1]\mathbb{E}[D_2]\\ &= \mathbb{E}\left[ (X_1+X_0)(X_2+X_0) \right] - \mathbb{E}\left[ X_1+X_0 \right]\mathbb{E}\left[ X_2+X_0 \right]\\ &= \mathbb{E}\left[ X_1 X_2 + X_1 X_0 + X_0 X_2 + X_0^2 \right] - \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_2 \right] - \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_0 \right] - \mathbb{E}\left[ X_0 \right]\mathbb{E}\left[ X_2 \right] - \left(\mathbb{E}\left[ X_0 \right]\right)^2\\ &= \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_2 \right] + \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_0 \right] + \mathbb{E}\left[ X_0 \right]\mathbb{E}\left[ X_2 \right] + \mathbb{E}\left[ X_0^2 \right] - \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_2 \right] - \mathbb{E}\left[ X_1 \right]\mathbb{E}\left[ X_0 \right] - \mathbb{E}\left[ X_0 \right]\mathbb{E}\left[ X_2 \right] - \left(\mathbb{E}\left[ X_0 \right]\right)^2\\ &= \mathbb{E}\left[ X_0^2 \right] - \left(\mathbb{E}\left[ X_0 \right]\right)^2 = \text{Var}(X_0), \end{aligned}$$ where the fourth line uses the mutual independence of $X_0$, $X_1$ and $X_2$ to factor the expectations of the products.
So, except in the trivial case where $X_0$ is a constant (so that $\text{Var}(X_0)=0$), $\mathbf{D_1}$ and $\mathbf{D_2}$ are not independent.
2. One could repeat the computation of point 1, or simply notice that, conditional on $X_0 = x_0$, we have $D_1 = X_1 + x_0$ and $D_2 = X_2 + x_0$: these are functions of the independent variables $X_1$ and $X_2$ alone, $x_0$ now being a fixed constant. So $\mathbf{D_1|X_0 =x_0}$ and $\mathbf{D_2|X_0 =x_0}$ are independent.
MY QUESTION
Are my proofs correct? Is there a simpler way to prove the same things, or a better, more intuitive way to argue for the same results?
A BONUS QUESTION
How can I test these theoretical results with the following example? (In order to really "see" what I found)
$X_0$ is the outcome of the fair coin "COIN": $X_0=1$ if Heads, $X_0=0$ if Tails.
$X_1$ is the outcome of the six-faced fair die "DICE1": $X_1\in\{1,2,3,4,5,6\}$
$X_2$ is the outcome of the six-faced fair die "DICE2": $X_2\in\{1,2,3,4,5,6\}$
SOMETHING THAT PERPLEXES ME
Intuitively and "constructively", I would say that the event of observing $D_1=k$ is independent of that of observing $D_2 = h$: in fact, the first is the event of obtaining $X_1 + X_0 = k$ with a throw of DICE1 and a toss of COIN, while the second is the event of obtaining $X_2 + X_0 = h$ with a throw of DICE2 and a different toss of COIN. These two "procedures" are totally independent of each other, and so should be their probabilities (?)
But this contradicts my analytical results, doesn't it? Why is that?
Simulation with your coins and dice: with a million replications of the experiment, results should be accurate to two decimal places.
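For instance, the million-replication simulation can be sketched in base R as follows (the seed and variable names are my own choices; the coin is coded $1$ = Heads, $0$ = Tails as in the question):

```r
# One million replications of the coin-and-dice experiment.
set.seed(2023)
n  <- 10^6
x0 <- rbinom(n, 1, 0.5)              # COIN: 1 = Heads, 0 = Tails
x1 <- sample(1:6, n, replace = TRUE) # DICE1
x2 <- sample(1:6, n, replace = TRUE) # DICE2
d1 <- x1 + x0
d2 <- x2 + x0
cov(d1, d2)  # close to Var(X_0) = 1/4 for a fair coin
```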
Matches your result that $\text{Cov}(D_1,D_2) = \text{Var}(X_0)$ (here $\text{Var}(X_0) = 0.25$ for the fair coin).
Matches your result that the covariance of the conditional variables $D_1 | X_0=1$ and $D_2 | X_0=1$ is $0.$
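The conditional claim can be checked in the same way: conditioning on Heads ($X_0 = 1$) just shifts two independent dice by the same constant. A sketch under the same assumptions:

```r
# Condition on X_0 = 1: both sums use the same fixed coin value.
set.seed(2024)
n  <- 10^6
x1 <- sample(1:6, n, replace = TRUE) # DICE1
x2 <- sample(1:6, n, replace = TRUE) # DICE2
d1 <- x1 + 1   # D_1 | X_0 = 1
d2 <- x2 + 1   # D_2 | X_0 = 1
cov(d1, d2)    # close to 0
```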
If we simulate only 5000 values of each and uniformly jitter them (to avoid over-plotting of the discrete points), we can visualize the nature of the association between $D_1$ and $D_2$ in your example with coins and dice.
R code for figure:
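A minimal base-R sketch of such a figure (the 5000-point sample size is from the text; the seed and plotting parameters are my own choices):

```r
# 5000 jittered points to visualize the association between D1 and D2.
set.seed(2025)
n  <- 5000
x0 <- rbinom(n, 1, 0.5)                   # shared coin toss
d1 <- sample(1:6, n, replace = TRUE) + x0
d2 <- sample(1:6, n, replace = TRUE) + x0
plot(jitter(d1), jitter(d2), pch = 20, cex = 0.4,
     xlab = "D1 (jittered)", ylab = "D2 (jittered)",
     main = "Joint distribution of D1 and D2")
```

The jitter avoids over-plotting of the discrete support points, so the dependence induced by the shared coin toss becomes visible.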