Derivation of the conditional expectation formula


A very basic question on the derivation of the conditional expectation formula for a continuous random variable. I get the intuition for the formula, but not one crucial step.

Starting from $E[X|Y] = \int x f_{X|Y}(x|y)\,dx = \int x \frac {f_{Y|X}(y|x)f_X(x)}{f_{Y}(y)}\,dx$

I know that we are supposed to end up with $E[X|Y] = \int x \frac {f_X(x)}{f_{Y}(y)}\,dx = \frac {E[X]}{P(Y)}$

I am wondering why $f_{Y|X}(y|x) = 1$. Is it because, as we condition on the whole range of $X$, it just collapses to $1$?

To give a specific example:

Why is it that, for $\theta \sim U[-1,1]$ and $a\in [-1,1]$,

$$E[\theta \mid \theta > a] = \frac{\int_{a}^1 \theta f(\theta)\,d\theta}{1-F(a)}\,?$$

Answer:

> I know that we are supposed to end up with ...

That's where you are going wrong: you aren't supposed to end up with that. You are missing an indicator random variable, and the result only holds when $Y$ is discrete, that is, when the event $Y=y$ has non-zero probability mass.

$$\begin{align}\mathsf E(X\mid Y=y) =& \int_\Bbb R x~f_{X\mid Y}(x\mid y)~\mathsf d x \\[1ex] = & \int_\Bbb R \frac{x~f_{X,Y}(x,y)}{f_Y(y)}~\mathsf d x\\[2ex] =& \dfrac{\int_\Bbb R x~f_{X,Y}(x,y)~\mathsf d x}{\int_\Bbb R f_{X,Y}(x,y)~\mathsf d x}\\[2ex] =& \dfrac{\sum_{k}\int_\Bbb R x~\mathbf 1_{k=y}~f_{X,Y}(x,k)~\mathsf d x}{\sum_{k}\int_\Bbb R \mathbf 1_{k=y}~f_{X,Y}(x,k)~\mathsf d x} &\text{if $Y$ is a discrete random variable} \\[2ex] =& \dfrac{\mathsf E(X~\mathbf 1_{Y=y})}{\mathsf P(Y=y)} \end{align}$$

If $Y$ is continuous, then you should stop at the first or second line.
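The discrete-$Y$ identity $\mathsf E(X\mid Y=y) = \mathsf E(X\,\mathbf 1_{Y=y})/\mathsf P(Y=y)$ is easy to sanity-check by simulation. The sketch below uses a hypothetical model not taken from the answer: $Y \sim \text{Bernoulli}(0.3)$ and $X \mid Y=y \sim \mathcal N(y, 1)$, so the true value of $\mathsf E(X \mid Y=1)$ is $1$.

```python
# Monte Carlo check of E(X | Y=y) = E(X * 1{Y=y}) / P(Y=y) for discrete Y.
# Hypothetical model (an assumption for illustration, not from the answer):
#   Y ~ Bernoulli(0.3),  X | Y=y ~ Normal(y, 1),  so E(X | Y=1) = 1.
import random

random.seed(0)
n = 200_000
num = 0.0   # running sum of X * 1{Y=1}, estimates n * E(X * 1{Y=1})
cnt = 0     # running count of 1{Y=1},   estimates n * P(Y=1)
for _ in range(n):
    y = 1 if random.random() < 0.3 else 0
    x = random.gauss(y, 1.0)
    if y == 1:
        num += x
        cnt += 1

# Ratio of the two Monte Carlo estimates: E(X * 1{Y=1}) / P(Y=1)
cond_mean = (num / n) / (cnt / n)
print(cond_mean)  # should be close to the true conditional mean, 1
```

Note that the same ratio fails for continuous $Y$: the event $Y=y$ then has probability zero, which is exactly why the answer says to stop at the first or second line of the display.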


So likewise:

If the event $\theta>a$ has non-zero probability mass (with $a$ constant and $\theta$ a random variable), then: $$\mathsf E(\theta\mid\theta>a) = \dfrac{\mathsf E(\theta~\mathbf 1_{\theta>a})}{\mathsf P(\theta>a)}$$
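For the uniform example from the question this can be verified numerically. With $\theta \sim U[-1,1]$ the density is $f(\theta)=1/2$, so the formula has the closed form $\frac{\int_a^1 \theta\,(1/2)\,d\theta}{1-F(a)} = \frac{(1-a^2)/4}{(1-a)/2} = \frac{1+a}{2}$. The sketch below picks $a=0.3$ (an arbitrary choice) and compares a Monte Carlo estimate of $\mathsf E(\theta\,\mathbf 1_{\theta>a})/\mathsf P(\theta>a)$ against that closed form.

```python
# Monte Carlo check of E(theta | theta > a) = E(theta * 1{theta>a}) / P(theta>a)
# for theta ~ Uniform[-1, 1]; the exact answer is (1 + a) / 2.
import random

random.seed(1)
a = 0.3          # arbitrary threshold in [-1, 1]
n = 200_000
num = 0.0        # running sum of theta * 1{theta > a}
cnt = 0          # running count of 1{theta > a}
for _ in range(n):
    t = random.uniform(-1.0, 1.0)
    if t > a:
        num += t
        cnt += 1

estimate = (num / n) / (cnt / n)   # E(theta * 1{theta>a}) / P(theta>a)
exact = (1 + a) / 2                # closed form for the uniform density
print(estimate, exact)             # the two should agree to ~2 decimal places
```

Here $\theta>a$ has probability $(1-a)/2 > 0$, so the indicator ratio is valid, in contrast to conditioning on a single point $Y=y$ of a continuous variable.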