Uniform Distribution Problem on $X, Y, Z$


Problem:

Let $X \sim \text{Uniform}(0,1)$. Let $0 < a < b < 1$. Let

$$ Y = \begin{cases} 1 & 0 < X < b \\ 0 & \text{otherwise} \end{cases} $$

and let

$$ Z = \begin{cases} 1 & a < X < 1 \\ 0 & \text{otherwise} \end{cases} $$

Question 1: How does one determine if $Y$ and $Z$ are independent?

I can see that $Y$ and $Z$ are independent for all $x \notin [0,1]$. Things are less clear otherwise.

Question 2: How does one compute $\mathbb{E}(Y \mid Z = z)$?

Of course, I know that if $Z$, $Y$ are independent then $\mathbb{E}(Y \mid Z = z) = \mathbb{E}(Y)$


There are 2 best solutions below


It's clear that $Y$ and $Z$ cannot be independent. If you observe $Y = 1$, then you know $X \in (0,b)$, and intuitively this information changes the probability of observing $Z = 1$. Formally, $$\Pr[Z = 1 \mid Y = 1] = \frac{\Pr[(Z = 1) \cap (Y = 1)]}{\Pr[Y = 1]} = \frac{b-a}{b} = 1 - \frac{a}{b},$$ but $$\Pr[Z = 1] = 1-a \ne 1 - \frac{a}{b},$$ since $a \ne 0$ and $b \ne 1$ by the assumption $0 < a < b < 1$.
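A quick Monte Carlo sketch supports this comparison; the values $a = 0.3$, $b = 0.7$ below are illustrative (any $0 < a < b < 1$ exhibits the same dependence):

```python
import random

# Monte Carlo sketch of the comparison above; a = 0.3, b = 0.7 are
# illustrative values, not part of the original problem.
random.seed(0)
a, b = 0.3, 0.7
n = 200_000

count_y1 = 0      # trials with Y = 1
count_z1 = 0      # trials with Z = 1
count_y1_z1 = 0   # trials with Y = 1 and Z = 1

for _ in range(n):
    x = random.random()       # X ~ Uniform(0, 1)
    y = 1 if x < b else 0     # Y = 1 on {0 < X < b}
    z = 1 if x > a else 0     # Z = 1 on {a < X < 1}
    count_y1 += y
    count_z1 += z
    count_y1_z1 += y * z

p_z1_given_y1 = count_y1_z1 / count_y1  # estimates (b - a)/b = 4/7
p_z1 = count_z1 / n                     # estimates 1 - a = 0.7
print(p_z1_given_y1, p_z1)
```

With these values the two probabilities differ by about $0.13$, far outside the sampling noise at this sample size.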


Define events $A$ and $B$ such that

$$Y = 1_{(X < b)} := 1_{B}$$

$$Z = 1_{(a < X)} := 1_{A}$$

Observe that $Y$ and $Z$ are independent iff $A$ and $B$ are independent, since $Y$ and $Z$ are exactly the indicator functions of $B$ and $A$.

$$P(A \cap B) = \int_a^b 1 dx = b-a$$

$$P(B) = \int_0^b 1 dx = b$$

$$P(A) = \int_a^1 1 dx = 1-a$$

By definition, $A$ and $B$ are independent iff $P(A \cap B) = P(A)P(B)$, i.e., iff

$$b-a=b(1-a) \iff b-a=b-ba \iff a = ba \iff a(1-b) = 0 \iff b=1 \ \text{or} \ a=0$$

Since the problem stipulates $0 < a < b < 1$, neither case holds, so $Y$ and $Z$ are not independent.

Recall that an event whose probability is 0 or 1 is independent of every event, including itself; this is why the boundary cases $a = 0$ (where $P(A) = 1$) and $b = 1$ (where $P(B) = 1$) yield independence.
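The chain of equivalences can be sanity-checked with exact rational arithmetic. A minimal sketch, where `independent` is a hypothetical helper and the fractions are illustrative values:

```python
from fractions import Fraction

def independent(a, b):
    """True iff A = {a < X} and B = {X < b} are independent, for 0 <= a <= b <= 1."""
    p_ab = b - a       # P(A ∩ B)
    p_a = 1 - a        # P(A)
    p_b = b            # P(B)
    return p_ab == p_a * p_b

print(independent(Fraction(1, 3), Fraction(2, 3)))  # dependent: 1/3 != 4/9
print(independent(Fraction(0, 1), Fraction(2, 3)))  # boundary case a = 0
print(independent(Fraction(1, 3), Fraction(1, 1)))  # boundary case b = 1
```

Using `Fraction` avoids floating-point rounding, so the equality test $P(A \cap B) = P(A)P(B)$ is exact.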


Definition: for a random variable $W$ and an event $A$ with $P(A) > 0$,

$$E[W \mid A] = \frac{E[W 1_A]}{P(A)}$$

Thus, we have

$$E[Y | Z=z] = \frac{E[Y1_{Z=z}]}{P(Z=z)}$$

$$ = \frac{E[1_B1_{1_A=z}]}{P(1_A=z)}$$

Thus, $E[Y \mid Z=z]$ is undefined if $z \notin \{0, 1\}$, since then $P(Z = z) = 0$.

For $z=1$, we have

$$ = \frac{E[1_B1_{1_A=1}]}{P(1_A=1)}$$

$$ = \frac{E[1_B1_A]}{P(A)}$$

$$ = \frac{E[1_{A \cap B}]}{P(A)}$$

$$ = \frac{P(A \cap B)}{P(A)}$$

$$ = P(B|A)$$

Can you take it from here?
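Plugging $P(A \cap B) = b-a$ and $P(A) = 1-a$ into the last line gives $P(B \mid A) = \frac{b-a}{1-a}$. A short simulation sketch with illustrative values $a = 0.3$, $b = 0.7$ (not from the original problem) checks this, along with the $z = 0$ case:

```python
import random

# Estimate E[Y | Z = z] by averaging Y over the samples where Z = z.
# a = 0.3, b = 0.7 are illustrative, not part of the original problem.
random.seed(1)
a, b = 0.3, 0.7
xs = [random.random() for _ in range(200_000)]   # X ~ Uniform(0, 1)

# Z = 1 means X in (a, 1); there E[Y | Z = 1] should approach (b - a)/(1 - a).
ys_given_z1 = [1 if x < b else 0 for x in xs if x > a]
estimate_z1 = sum(ys_given_z1) / len(ys_given_z1)
exact_z1 = (b - a) / (1 - a)   # = 4/7 for these values

# Z = 0 means X in (0, a]; since (0, a] lies inside (0, b), Y is always 1 there.
ys_given_z0 = [1 if x < b else 0 for x in xs if x <= a]
estimate_z0 = sum(ys_given_z0) / len(ys_given_z0)

print(estimate_z1, exact_z1, estimate_z0)
```

The $z = 0$ average is exactly 1 because every sample with $Z = 0$ also has $Y = 1$, matching $E[Y \mid Z = 0] = P(B \mid A^c) = 1$.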