I have a bit of a dilemma. Let $X\sim U(0,1)$, and for two unknown constants $0<a<b<1$ define:
$$Y = \begin{cases} 1 & 0<X<b\\ 0 & \text{otherwise} \end{cases} \;=\; I(0<X<b)$$
$$Z = \begin{cases} 1 & a<X<1\\ 0 & \text{otherwise} \end{cases} \;=\; I(a<X<1)$$
We would like to know whether $Y$ and $Z$ are independent.
Intuitively: if $Y=1$ we can't tell whether $X \in (0,a)$ or $X \in (a,b)$, but if $Y=0$ then certainly $X \in (b,1)$, and since $b>a$ this forces $X>a$, i.e. $Z=1$.
I might want to argue that $Y\mid X=x$ and $Z\mid X=x$ are independent, since given $x$ both $Y$ and $Z$ are deterministic. But when I calculate their covariance I don't get zero, which is an indication of dependence.
From the law of total expectation:
$$\mathbb{E}[YZ]=\mathbb{E}[\mathbb{E}[YZ\mid X]]= 1\times\mathbb{P}\big((a<X<1)\cap(0<X<b)\big)=\mathbb{P}(a<X<b)=b-a,$$ since $YZ=1$ exactly when both indicators equal $1$, and $YZ=0$ otherwise.
- $$\mathbb{E}[Y] =\mathbb{E}[\mathbb{E}[Y\mid X]]= \mathbb{E}[I(0<X<b)] = 1\times\mathbb{P}(0<X<b)=b$$
- $$\mathbb{E}[Z] =\mathbb{E}[\mathbb{E}[Z\mid X]]= \mathbb{E}[I(a<X<1)] = 1\times\mathbb{P}(a<X<1)=1-a$$
Therefore I get:
$$\operatorname{Cov}(Y,Z) = \mathbb{E}[YZ] - \mathbb{E}[Y]\mathbb{E}[Z] = (b-a) - b(1-a) = b-a-b+ba = ba-a = a(b-1) \neq 0,$$ which is strictly negative since $a>0$ and $b<1$.
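As a sanity check, the covariance computation above can be verified by simulation. This is a minimal sketch; the values $a = 0.3$, $b = 0.7$ are arbitrary example choices, not from the problem:

```python
import random

# Monte Carlo check that Cov(Y, Z) is approximately a*(b - 1).
# a and b are arbitrary example values satisfying 0 < a < b < 1.
random.seed(0)
a, b = 0.3, 0.7
n = 200_000

sum_y = sum_z = sum_yz = 0
for _ in range(n):
    x = random.random()          # X ~ U(0, 1)
    y = 1 if x < b else 0        # Y = I(0 < X < b)
    z = 1 if x > a else 0        # Z = I(a < X < 1)
    sum_y += y
    sum_z += z
    sum_yz += y * z

ey, ez, eyz = sum_y / n, sum_z / n, sum_yz / n
cov = eyz - ey * ez
print(cov, a * (b - 1))  # both should be close to -0.09
```

The empirical covariance lands near $a(b-1) = -0.09$, consistent with the derivation.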
From this result I conclude that $Y$ and $Z$ are dependent, but a solution to this problem showed independence, which leaves me confused. I'm worried my approach has a mistake somewhere; I'd appreciate some guidance.
This is more of a comment, but I do not have sufficient reputation. Here is an argument why they are not independent. It is sufficient to show that $\mathbb{P}\left(Y=1 \mid Z = 1\right) \neq \mathbb{P}\left(Y = 1 \mid Z = 0\right)$. Observe that the first probability is $$\mathbb{P}\left(X < b \mid X > a\right) = \frac{\mathbb{P}\left(a < X < b\right)}{\mathbb{P}\left(X > a\right)} = \frac{b-a}{1-a}.$$ Likewise, the second probability is just $$\mathbb{P}\left(X < b \mid X \leq a\right) = 1,$$ since $a < b$. Evidently, $\frac{b-a}{1-a} \neq 1$ because $b - a \neq 1 - a$, as $b < 1$.
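The two conditional probabilities can also be checked empirically. A minimal sketch, again with arbitrary example values $a = 0.3$, $b = 0.7$:

```python
import random

# Estimate P(Y=1 | Z=1) and P(Y=1 | Z=0) by simulation.
# a and b are arbitrary example values with 0 < a < b < 1.
random.seed(1)
a, b = 0.3, 0.7
n = 200_000

count_z1 = count_y1_z1 = count_z0 = count_y1_z0 = 0
for _ in range(n):
    x = random.random()   # X ~ U(0, 1)
    y = x < b             # Y = I(0 < X < b)
    z = x > a             # Z = I(a < X < 1)
    if z:
        count_z1 += 1
        count_y1_z1 += y
    else:
        count_z0 += 1
        count_y1_z0 += y

p_y1_given_z1 = count_y1_z1 / count_z1
p_y1_given_z0 = count_y1_z0 / count_z0
print(p_y1_given_z1)  # close to (b - a) / (1 - a)
print(p_y1_given_z0)  # exactly 1.0: X <= a implies X < b since a < b
```

The first estimate matches $(b-a)/(1-a) \approx 0.571$ while the second is exactly $1$, so the two conditional probabilities differ and $Y, Z$ are dependent.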