Conditional expectation with indicator function

I'm trying to get familiar with indicator functions.

Let $X$ and $Y$ be two independent dice with $\Omega=\{1,2,3,4,5,6\}$ and $X\sim Y \sim \mathrm{Unif}(\Omega)$, and let $Z=X+Y$ be their sum. Now I want to compute the conditional expectation $\operatorname{E}(Z\mid X)$ using indicator functions.

So my attempt:

We have $\operatorname{E}(Z\mid X)(\omega):=\sum\limits_{i=1}^6 \operatorname{E}(Z\mid X=x_i)\,\mathbf{1}_{\{X=x_i\}}(\omega)$

So it follows $$\operatorname{E}(Z\mid X=x)=\frac{\operatorname{E}(Z\,\mathbf{1}_{\{X=x\}})}{P(X=x)}=\frac{\sum\limits_{\Omega' \cap \{X=x\}}z\,P(Z=z)}{P(X=x)}$$

$P(X=x)=1/6$ and if I set $\Omega'_0:={\Omega' \cap \{X=x\}}$ then it becomes

$$\operatorname{E}(Z\mid X=x)=6\sum\limits_{\Omega'_0}zP(Z=z)$$

Now I know that the RHS defines a new conditional measure. But how can I compute further?
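For a numeric sanity check of the ratio $\operatorname{E}(Z\,\mathbf{1}_{\{X=x\}})/P(X=x)$, one can enumerate all 36 equally likely outcomes of the two dice directly. This is just a sketch; the helper name `cond_exp_Z_given_X` is mine.

```python
from fractions import Fraction

omega = range(1, 7)  # faces of one die

def cond_exp_Z_given_X(x):
    # E(Z * 1_{X=x}): sum z * P over the joint sample space,
    # where each outcome (xi, y) has probability 1/36
    num = sum(Fraction(xi + y, 36) for xi in omega for y in omega if xi == x)
    p_x = Fraction(1, 6)  # P(X = x)
    return num / p_x

for x in omega:
    print(x, cond_exp_Z_given_X(x))  # each value equals x + 7/2
```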

There are 2 solutions below.

On BEST ANSWER

The conditional probability of $Z=z$ given $X=x$ is:

$$P(Z=z\mid X=x)=\begin{cases}0, & z = 2,\dots,x\\[2pt] \frac{1}{6}, & z = x+1,\dots,x+6\\[2pt] 0, & z = x+7,\dots,12\end{cases}$$

so

$$\sum_z z \cdot P(Z=z\mid X=x)=\big((x+1)+(x+2)+\dots+(x+6)\big)\cdot\frac{1}{6}=(6x+21)\cdot\frac{1}{6}=x+3.5$$

This should stand to reason, since $\operatorname{E}(Y) = 3.5$.

Of course, this assumes that $X$ and $Y$ are independent.
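The case analysis above can be checked by counting outcomes. A minimal sketch (the helper name `pmf_Z_given_X` is an illustration, not standard):

```python
from fractions import Fraction
from itertools import product

def pmf_Z_given_X(z, x):
    # P(Z = z | X = x) by counting among the 6 outcomes with X = x
    outcomes = [(xi, y) for xi, y in product(range(1, 7), repeat=2) if xi == x]
    hits = [o for o in outcomes if sum(o) == z]
    return Fraction(len(hits), len(outcomes))

x = 4
for z in range(2, 13):
    # 1/6 for z in x+1 .. x+6, and 0 otherwise
    assert pmf_Z_given_X(z, x) == (Fraction(1, 6) if x + 1 <= z <= x + 6 else 0)

# the conditional mean, matching x + 3.5
mean = sum(z * pmf_Z_given_X(z, x) for z in range(2, 13))
print(mean)  # 15/2 for x = 4
```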


There is a smooth solution to this that avoids indicator functions entirely. So it is not really an answer to your question, but it might interest you anyway:

$$\mathbb E(Z\mid X)=\mathbb E(Y+X\mid X)=\mathbb E(Y\mid X)+\mathbb E(X\mid X)=\mathbb E(Y\mid X)+X$$

If moreover $X$ and $Y$ are independent, then $\mathbb E(Y\mid X)=\mathbb E Y$, so we end up with: $$\mathbb E(Z\mid X)=\mathbb E Y+X=3.5+X$$
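A quick Monte Carlo check of this identity: averaging $Z$ over throws with a fixed value of $X$ should land near $x + 3.5$ for every $x$. A sketch using Python's standard `random` module:

```python
import random

random.seed(0)
n = 200_000
sums = {x: 0 for x in range(1, 7)}    # running total of Z for each observed X = x
counts = {x: 0 for x in range(1, 7)}  # number of throws with X = x

for _ in range(n):
    x, y = random.randint(1, 6), random.randint(1, 6)
    sums[x] += x + y
    counts[x] += 1

for x in range(1, 7):
    # empirical E(Z | X = x), approximately x + 3.5
    print(x, round(sums[x] / counts[x], 2))
```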