Good day,
I am currently working through "Probability: Theory and Examples" by Durrett, and while getting familiar with conditional expectations I came across this problem:
Consider the Lebesgue probability space on the interval $[0,1)$. (I.e. the state space is $\Omega = [0, 1)$, the $\sigma$-field is the set of Lebesgue measurable sets and the measure is the Lebesgue measure.) We define the random variable $X$ as: $$X(w)=\begin{cases} 2w & 0\leq w < 1/2 \\ 2w-1 & 1/2\leq w<1 \end{cases}$$ Compute the conditional expectation $E(Y|X)$ where $Y : [0, 1) \to \mathbb{R}$ is a measurable function.
First off, $Y$ is not defined further, and I am a bit confused by the term "measurable function". Measurable with respect to what? If $Y$ were a measurable function of $X$ (i.e. $\sigma(X)$-measurable), then we would simply have $E(Y|X)=Y$. So I assume $Y$ is an arbitrary random variable, not necessarily a function of $X$.
Second, let's define the conditional expectation: $E(Y|X):=E(Y|\sigma(X))$ is a random variable $Z$ such that $Z$ is measurable w.r.t. $\sigma(X)$ and $E(1_A Y)=E(1_A Z)$ for all $A \in \sigma(X)$.
Okay, let's pick an $A \in \sigma(X)$; then there exists a $B \in \mathcal{B}(\mathbb{R})$ such that $A=X^{-1}(B)$. As Graham Kemp helpfully hinted, the preimages under $X$ are given by
$$X^{-1}\{x\}~=~\{w\in[0;1):x=X(w)\}~=~\begin{cases}\{x/2, (1+x)/2\}&:& 0\leq x<1\\ \{\}&:&\text{otherwise}\end{cases}$$
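As a quick sanity check of this preimage formula (a numerical sketch; the function `X` below is just the map from the problem statement), one can verify that both claimed preimages of $x$ indeed map back to $x$:

```python
def X(w):
    """The random variable X on [0, 1): w -> 2w mod 1."""
    return 2 * w if w < 0.5 else 2 * w - 1

# For each x in [0, 1), the two claimed preimages are x/2 and (1 + x)/2.
for x in [0.0, 0.1, 0.25, 0.5, 0.73, 0.99]:
    w1, w2 = x / 2, (1 + x) / 2
    assert abs(X(w1) - x) < 1e-12
    assert abs(X(w2) - x) < 1e-12
```

So every level set of $X$ consists of exactly two points, one in $[0,1/2)$ and one in $[1/2,1)$.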
Now I am not sure what to do. Normally I would begin from $$E(1_A Y)= \int_A Y \, dP$$ and change variables to $\int_B y \, P_Y(dy)$, but that step would require $A=Y^{-1}(B)$, whereas here $A=X^{-1}(B)$, so I am stuck. Can someone take it from here and show me what to do? I am thankful for every help/hint.
For every value of $x$ in $[0;1)$, the equation $x=X(w)$ has two solutions for $w$.
$$X^{-1}\{x\}~=~\{w\in[0;1):x=X(w)\}~=~\begin{cases}\{x/2, (1+x)/2\}&:& 0\leq x<1\\ \{\}&:&\text{otherwise}\end{cases}$$
Since the probability space is Lebesgue ...
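The answer appears to cut off here. As a hedged numerical sketch (my own candidate, not the answer's actual continuation): since each level set $X^{-1}\{x\}$ has the two points $x/2$ and $(1+x)/2$, and both occur with equal weight under Lebesgue measure, a natural guess is $Z(\omega)=\tfrac12\big(Y(X(\omega)/2)+Y((1+X(\omega))/2)\big)$, the average of $Y$ over the level set. One can test the defining property $E(1_A Y)=E(1_A Z)$ by Monte Carlo for sets of the form $A=X^{-1}([0,b))$, using an arbitrary illustrative choice $Y(w)=w^2$:

```python
import random

def X(w):
    """The random variable from the problem: w -> 2w mod 1."""
    return 2 * w if w < 0.5 else 2 * w - 1

def Y(w):
    """An arbitrary measurable Y chosen for illustration."""
    return w * w

def Z(w):
    """Candidate for E(Y|X): average of Y over the level set X^{-1}{X(w)}."""
    x = X(w)
    return 0.5 * (Y(x / 2) + Y((1 + x) / 2))

random.seed(0)
samples = [random.random() for _ in range(100000)]

# Check E(1_A Y) ≈ E(1_A Z) for A = X^{-1}([0, b)) = [0, b/2) ∪ [1/2, (1+b)/2).
for b in (0.3, 0.7, 1.0):
    A = [w for w in samples if X(w) < b]
    lhs = sum(Y(w) for w in A) / len(samples)   # Monte Carlo E(1_A Y)
    rhs = sum(Z(w) for w in A) / len(samples)   # Monte Carlo E(1_A Z)
    assert abs(lhs - rhs) < 0.01
```

Since $Z$ is by construction a function of $X$, it is $\sigma(X)$-measurable, so passing this check (up to Monte Carlo error) is consistent with $Z$ being a version of $E(Y|X)$.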