Find version of conditional expectation


I'm struggling with the concept of conditional expectation. We didn't cover it in my probability theory class, yet it's required for my statistics course. I basically have no idea how to solve any of our homework problems involving conditional expectation, which really bothers me: the other problems are usually not that hard, but I'm lost as soon as conditional expectation comes into play...

I'm currently working this problem, quite unsuccessfully though:

Let $X$ be an integrable, real-valued random variable and $f$ its density with respect to Lebesgue measure. Let $Y=g(X)$, where $g: \mathbb{R} \rightarrow \mathbb{R}$ has a positive derivative on $(0,\infty)$ and is symmetric about $0$, i.e. $g(-x)=g(x)$. Explicitly give a function $\psi$ (depending on $f$ and $g$) such that $\psi(X)$ is a version of the conditional expectation $\mathbb{E}[X\mid Y]$.

Going by the definition of conditional expectation, I have to choose $\psi$ so that it satisfies these two conditions:

a) $\psi(X)$ is $\sigma(Y)$-measurable.

b) $\mathbb{E}[X1_B]=\mathbb{E}[\psi(X)1_B]$, i.e. $\int_{B} X \,d\mathbb{P}=\int_{B} \psi(X) \,d\mathbb{P}$, for each $B \in \sigma(Y)$.

The density property of $f$ means $\mathbb{P}(X\in A)=\int_{A}f \,d\lambda=\int_{X^{-1}(A)}1 \,d\mathbb{P}$ for each Borel set $A$. But how does this relate to conditions a) and b)? I can't figure it out, let alone make use of the special form of $g$...

Can anyone help me to solve this problem?

I would also be thankful for links to similar problems and examples, as I know that conditional expectations will be crucial in statistics. So I would like to achieve a thorough understanding of the concept.

Thanks in advance for any help whatsoever!

Best answer:

Since $g$ is even and has a positive derivative on $(0,\infty)$, it is injective on $[0,\infty)$, hence $\sigma(Y)=\sigma(g(X))=\sigma(|X|)$ and it suffices to solve the case $g(x)=|x|$, i.e. $Y=|X|$. Then $E[X\mid Y]=h(Y)$ for some function $h$ to be found, such that, for every bounded measurable function $u$, $E[Xu(Y)]=E[h(Y)u(Y)]$, that is, $E[Xu(|X|)]=E[h(|X|)u(|X|)]$.

The distribution of $|X|$ has density $\bar f:y\mapsto f(y)+f(-y)$ on $[0,\infty)$, hence one asks that $$ \int_\mathbb R xu(|x|)f(x)\,\mathrm dx=\int_0^\infty h(y)u(y)\bar f(y)\,\mathrm dy. $$ Splitting the LHS at $0$ and substituting $x=-y$ on $(-\infty,0)$ shows that it equals $$ \int_0^\infty y\,(f(y)-f(-y))\,u(y)\,\mathrm dy. $$ By identification, this proves that, for every $y\geqslant0$ with $\bar f(y)>0$, $$ h(y)=y\,\frac{f(y)-f(-y)}{f(y)+f(-y)}, $$ hence $$ E[X\mid |X|]=|X|\,\frac{f(|X|)-f(-|X|)}{f(|X|)+f(-|X|)}. $$

Edit: For a thorough understanding of the concept of conditional expectation, I recommend the small and excellent book Probability with Martingales by David Williams.
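The formula can be sanity-checked numerically. Below is a minimal sketch (my own construction, not part of the problem) using a deliberately asymmetric density: take $X = S\,|Z|$ with $Z$ standard normal and $P(S=1)=0.7$, $P(S=-1)=0.3$, so $f(y)=1.4\,\varphi(y)$ and $f(-y)=0.6\,\varphi(y)$ for $y>0$, and the formula predicts $h(y)=0.4\,y$. We then verify the defining property $E[X\,u(|X|)]=E[h(|X|)\,u(|X|)]$ by Monte Carlo for a few bounded test functions $u$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Asymmetric test distribution: X = S * |Z|, Z ~ N(0,1),
# S = +1 with prob 0.7 and -1 with prob 0.3.
# Then f(y) = 1.4*phi(y), f(-y) = 0.6*phi(y) for y > 0, so
# h(y) = y * (1.4 - 0.6) / (1.4 + 0.6) = 0.4 * y.
z = np.abs(rng.standard_normal(n))
s = np.where(rng.random(n) < 0.7, 1.0, -1.0)
x = s * z
y = np.abs(x)      # the conditioning variable Y = |X|
h = 0.4 * y        # psi(X) = h(|X|) predicted by the formula

# Defining property: E[X u(|X|)] = E[h(|X|) u(|X|)] for bounded u.
for u in (np.cos, np.sin, lambda t: 1.0 / (1.0 + t)):
    lhs = np.mean(x * u(y))
    rhs = np.mean(h * u(y))
    assert abs(lhs - rhs) < 0.01, (lhs, rhs)
print("defining property of E[X | |X|] verified")
```

Monte Carlo error here is of order $10^{-3}$, so the $0.01$ tolerance comfortably accommodates it while still catching a wrong $h$.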