I'm trying to get familiar with indicator functions.
Let $X$ and $Y$ be two independent fair dice with $\Omega=\{1,2,3,4,5,6\}$ and $X\sim Y \sim \mathrm{Unif}(\Omega)$, and let $Z=X+Y$ be their sum. Now I want to compute the conditional expectation $E(Z\mid X)$ using indicator functions.
So my attempt:
We have $$E(Z\mid X)(\omega):=\sum\limits_{i=1}^6 \operatorname{E}(Z\mid X=x_i)\,\mathbf 1_{\{X=x_i\}}(\omega)$$
So it follows that
$$\operatorname{E}(Z\mid X=x)=\frac{\operatorname{E}(Z\,\mathbf 1_{\{X=x\}})}{P(X=x)}=\frac{\sum\limits_{z} z\,P(Z=z,\ X=x)}{P(X=x)}$$
Since $P(X=x)=1/6$, this becomes
$$\operatorname{E}(Z\mid X=x)=6\sum\limits_{z} z\,P(Z=z,\ X=x)$$
Now I know that the RHS defines a new conditional measure, but how can I compute further?
The conditional distribution of $Z$ given $X=x$ is
$$P(Z=z\mid X=x)=\begin{cases}\frac{1}{6}, & z=x+1,\dots,x+6,\\[2pt] 0, & \text{otherwise (i.e. } z\le x \text{ or } z\ge x+7).\end{cases}$$
$$\sum_z z\,P(Z=z\mid X=x)=\big((x+1)+(x+2)+\dots+(x+6)\big)\cdot\frac{1}{6}=\frac{6x+21}{6}=x+3.5$$
This stands to reason, since $E(Z\mid X)=X+E(Y)$ and $E(Y)=3.5$.

Of course, this assumes that $X$ and $Y$ are independent.
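As a sanity check, here is a small Python sketch (the function name `cond_expectation_of_sum` is my own) that computes $E(Z\mid X=x)$ by brute-force enumeration, using exact fractions:

```python
from fractions import Fraction

def cond_expectation_of_sum(x):
    # Given X = x, the six outcomes (x, y) for y = 1..6 are equally likely,
    # so E(Z | X = x) is the average of x + y over y = 1..6.
    outcomes = [x + y for y in range(1, 7)]
    return Fraction(sum(outcomes), len(outcomes))

for x in range(1, 7):
    # Each value equals x + 7/2, matching the formula x + 3.5 above.
    print(x, cond_expectation_of_sum(x))
```

Since the conditional distribution puts mass $1/6$ on each of $x+1,\dots,x+6$, averaging those six values reproduces the closed form exactly.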