By definition of $\mathbb E(X\mid \sigma(Y))$, calculate $\mathbb E(X\mid Y=y)$ when $X$ and $Y$ are discrete random variables.


Using the definition of $\mathbb E(X\mid \sigma(Y))$, I want to show that $$\mathbb E(X\mid Y=y)=\sum_x x\, \mathbb P(X=x\mid Y=y)$$ when $X$ and $Y$ are jointly discrete random variables. (The absolutely continuous case is treated here.) I want to know whether my steps are right. I would appreciate some explanation of the steps from start to end (the uncertain ones are marked with ?).

Let $Y$ be a discrete random variable taking values in $\{ a_1 ,a_2,\cdots ,a_n\}$. Then $\sigma(Y)=\sigma(\{Y=a_1\},\cdots , \{ Y=a_n\})$ (??)

By definition, for all $A\in \sigma(Y)$,

$$ \mathbb E \left( \mathbb E (X\mid \sigma(Y))\,1_A\right) =\mathbb E(X1_A).$$ Since $A\in \sigma(Y)$, the indicator $1_A$ is a function of $Y$, i.e. $A=\{Y\in B\}$ for some set $B$ of values, so I think I can write $$\mathbb E \left( \mathbb E (X\mid\sigma(Y))\,1_B(Y)\right)=\mathbb E(X1_B(Y)).$$

$$RHS=\mathbb E(X1_B(Y))=\sum_{y\in B} \sum_{x} x \mathbb P(X=x,Y=y) $$

$$LHS=\mathbb E \left( \mathbb E (X\mid \sigma(Y))\,1_B(Y)\right) =\mathbb E \left( \mathbb E (X\mid Y)\,1_B(Y)\right) $$

$$=\mathbb E \left( g(Y) 1_B(Y) \right)=\sum_{y\in B} g(y) \mathbb P(Y=y) =\sum_{y\in B} \mathbb E(X\mid Y=y) \mathbb P(Y=y), $$ where $g(y)=\mathbb E(X\mid Y=y)$. Equating $LHS$ and $RHS$, for all $B$:

$$\sum_{y\in B} \mathbb E(X\mid Y=y) \mathbb P(Y=y)=\sum_{y\in B} \sum_{x} x \mathbb P(X=x,Y=y)$$

I think (??) that for $y\in \{ a_1 ,a_2,\cdots ,a_n\}$ I can take $B=\{y\}$ (since the equation holds for all $B$ ??) and write

$$ \mathbb E(X\mid Y=y) \mathbb P(Y=y)= \sum_{x} x \mathbb P(X=x,Y=y)$$

so for each $y\in \{ a_1 ,a_2,\cdots ,a_n\}$ with $\mathbb P(Y=y)>0$,

$$ \mathbb E(X\mid Y=y) = \sum_{x} x \frac{ \mathbb P(X=x,Y=y)}{ \mathbb P(Y=y)}= \sum_{x} x \mathbb P(X=x\mid Y=y).$$
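The derivation above can also be checked numerically. Below is a minimal sketch (the joint pmf is a made-up example, not from the question): it computes $g(y)=\mathbb E(X\mid Y=y)$ from the conditional pmf and then verifies the defining property $\mathbb E(g(Y)1_B(Y))=\mathbb E(X1_B(Y))$ for every subset $B$ of the support of $Y$.

```python
from itertools import chain, combinations

# Hypothetical small joint pmf: X in {0,1,2}, Y in {0,1} (illustrative values).
joint = {(0, 0): 0.10, (1, 0): 0.20, (2, 0): 0.10,
         (0, 1): 0.15, (1, 1): 0.25, (2, 1): 0.20}
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginal P(Y = y)
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# g(y) = E(X | Y = y) via the conditional pmf, as in the final formula.
g = {y: sum(x * joint[(x, y)] / p_y[y] for x in xs) for y in ys}

def powerset(s):
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Defining property: E(g(Y) 1_B(Y)) = E(X 1_B(Y)) for every B in the powerset.
for B in powerset(ys):
    lhs = sum(g[y] * p_y[y] for y in B)
    rhs = sum(x * joint[(x, y)] for x in xs for y in B)
    assert abs(lhs - rhs) < 1e-12, (B, lhs, rhs)

print({y: round(g[y], 4) for y in ys})
```

Since the equality is checked for all $2^n$ subsets $B$, this mirrors the "for all $B$" step in the derivation rather than only the singleton case.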

This proof was for finite support (e.g. $Y$ binomial). Is it still valid for countable support (e.g. $Y$ Poisson)?

Thanks in advance for any help you are able to provide or any clarification.

Best answer:

Again, this is essentially correct, but you are overcomplicating things: the discrete case can be handled much more simply.

Namely, for any $y$ such that $\mathbb P(Y=y)>0$, $$ \mathbb E[ \mathbb E[ X\mid Y]1_{Y=y}] = \mathbb E[ X 1_{Y=y}]\\ = \mathbb E\Bigl[\sum_x x1_{X=x}1_{Y=y}\Bigr] = \sum_x x\mathbb E[ 1_{X=x,Y=y}] = \sum_x x\mathbb P(X=x,Y=y). $$ On the other hand, $$ \mathbb E[ \mathbb E[ X\mid Y]1_{Y=y}] = \mathbb E[ X\mid Y=y]\mathbb E[ 1_{Y=y}] = \mathbb E[ X\mid Y=y]\mathbb P(Y=y). $$ Dividing these equalities by $\mathbb P(Y=y)$, we arrive at the statement. Note that nothing in this argument uses finiteness of the support, so it works verbatim when $Y$ takes countably many values (e.g. Poisson).
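As a sanity check for the countable-support case, here is a sketch with an assumed pair of distributions (my choice, not from the answer): $Y\sim\mathrm{Poisson}(\lambda)$ and, given $Y=y$, $X\sim\mathrm{Binomial}(y,p)$, for which $\mathbb E(X\mid Y=y)=yp$ in closed form. For each fixed $y$ the sum over $x$ is finite, and the key identity $\mathbb E(X\mid Y=y)\,\mathbb P(Y=y)=\sum_x x\,\mathbb P(X=x,Y=y)$ holds exactly (up to floating point):

```python
import math

# Assumed example: Y ~ Poisson(lam); given Y = y, X ~ Binomial(y, p),
# so E(X | Y = y) = y * p in closed form.
lam, p = 3.0, 0.4

def p_y(y):
    # P(Y = y), Poisson pmf
    return math.exp(-lam) * lam**y / math.factorial(y)

def p_xy(x, y):
    # P(X = x, Y = y) = P(Y = y) * P(X = x | Y = y), binomial conditional pmf
    return p_y(y) * math.comb(y, x) * p**x * (1 - p)**(y - x)

for y in range(1, 8):
    # RHS of the identity: sum_x x * P(X = x, Y = y)
    rhs = sum(x * p_xy(x, y) for x in range(y + 1))
    # LHS: E(X | Y = y) * P(Y = y), using the closed form E(X | Y = y) = y * p
    lhs = y * p * p_y(y)
    assert abs(lhs - rhs) < 1e-12, (y, lhs, rhs)

print("identity holds for y = 1..7")
```

Only finitely many $y$ can be tested, of course; the point is that each fixed-$y$ identity involves no sum over the whole (countable) support of $Y$, which is why the argument carries over.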