I started (again!) with the intention to build an interesting example of a computation of conditional expectation with respect to $\sigma(X) $ when $X $ is not a step function. My first example, choosing $\Omega = [-1, 1], $ ${\cal F} = {\cal B}(\Omega), $ and $P = (1/2)\lambda, $ with
$X: \Omega \to \mathbb{R}: \omega \mapsto \omega^2 $ or $|\omega |, $ worked fine, since $\sigma(X) = \{A \in {\cal B}(\Omega): A= -A\}. $
When I tried a function that is not symmetric, say
$Y(\omega) = -\omega\cdot I_{[-1, 0]}(\omega) + 2\omega \cdot I_{[0, 1]}(\omega), $ no such luck. I can see the overall structure of $\sigma(Y), $ but cannot really write it down in an elegant manner: I see that intervals like $(a, b) $ with $a = -2b $ should be part of $\sigma(Y) $ if $b > 0 $ (for instance, $Y^{-1}([0, v)) = (-v, v/2) $), but I was hoping for an elegant expression like the one for $\sigma(X) $ above. Is that possible?
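For what it is worth, the two-point fiber structure that makes $\sigma(Y)$ awkward can be seen numerically. This is a rough grid-based sketch (the helper `fiber` is ad hoc, not anything standard):

```python
import numpy as np

def Y(w):
    # the asymmetric function from the question: -w on [-1, 0), 2w on [0, 1]
    return np.where(w < 0, -w, 2 * w)

def fiber(y, grid):
    # all grid points of Omega = [-1, 1] that Y maps (up to rounding) to y
    return grid[np.isclose(Y(grid), y, rtol=0, atol=1e-9)]

grid = np.linspace(-1, 1, 400001)
print(fiber(0.4, grid))   # two points, approx -0.4 and 0.2
print(fiber(1.5, grid))   # one point, approx 0.75
```

Values in $[0,1]$ are hit twice, once from each branch, with the negative preimage at $-2$ times the positive one; values in $(1,2]$ are hit only once. Any set in $\sigma(Y)$ must contain such fibers whole, which is why single intervals $(a,b)$ only work when $a=-2b$.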
Thank you.
Maurice
First (writing $X$ for the function called $Y$ in the question, and reserving $Y$ for a generic integrable random variable), for any $B\in\mathcal{B}(\mathbb{R})$, $$ X^{-1}(B)=X^{-1}(B\cap[0,1])\cup X^{-1}(B\cap(1,2])\equiv C_1\cup C_2. $$ Notice that $C_1\cap C_2=\emptyset$, that $C_2=\emptyset$ or $C_2\subseteq(1/2,1]$, and that $C_1$ is "symmetric" around $0$ in the following sense: if $C_1^+\equiv C_1\cap [0,1/2]$ and $C_1^-\equiv C_1\cap [-1,0)$, then $C_1^-=-2C_1^+$.
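A quick numerical sanity check of this decomposition, using an arbitrary test set $B=(0.2,0.6]$ (so $C_2=\emptyset$ here); the grid approach is just an illustration, not part of the argument:

```python
import numpy as np

def X(w):
    # the function from the question: -w on [-1, 0), 2w on [0, 1]
    return np.where(w < 0, -w, 2 * w)

w = np.linspace(-1, 1, 200001)          # dense grid on Omega = [-1, 1]
in_B = (X(w) > 0.2) & (X(w) <= 0.6)     # preimage of B = (0.2, 0.6]

C1_plus = w[in_B & (w >= 0)]            # lands in [0, 1/2]
C1_minus = w[in_B & (w < 0)]            # lands in [-1, 0)

# C1_minus should be -2 * C1_plus; compare the interval endpoints
print(C1_plus.min(), C1_plus.max())     # approx 0.1 and 0.3
print(C1_minus.min(), C1_minus.max())   # approx -0.6 and -0.2
```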
For an integrable random variable $Y$, $\mathsf{E}[Y\mid \sigma(X)]=Z$, where the two points $y/2$ and $-y$ of a fiber $X^{-1}(\{y\})$, $y\in[0,1]$, must be weighted proportionally to $1/|X'|$ there ($|X'|=2$ on $[0,1]$, $|X'|=1$ on $[-1,0)$), i.e. with weights $1/3$ and $2/3$ rather than a plain average: \begin{align} Z(\omega)&:=Y(\omega)1_{(1/2,1]}(\omega)+\frac{Y(\omega)+2\,Y(-2\omega)}{3}1_{[0,1/2]}(\omega)\\ &\quad+\frac{Y(-\omega/2)+2\,Y(\omega)}{3}1_{[-1,0)}(\omega). \end{align} One checks directly that $Z$ is constant on fibers (hence $\sigma(X)$-measurable) and that $\mathsf{E}[Z\,1_A]=\mathsf{E}[Y\,1_A]$ for every $A\in\sigma(X)$.
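A numerical check of the defining property $\mathsf{E}[Z\,1_A]=\mathsf{E}[Y\,1_A]$ for $A\in\sigma(X)$ (a sketch: the integrand $Y$ and the set $B$ below are arbitrary choices; the fiber weights $1/3$ and $2/3$ are proportional to $1/|X'|$ at the two preimage points):

```python
import numpy as np

def X(w):
    # slope 2 on [0, 1], slope magnitude 1 on [-1, 0)
    return np.where(w < 0, -w, 2 * w)

def Y(w):
    # an arbitrary integrable test integrand
    return np.exp(w) + w**3

def Z(w):
    # candidate conditional expectation with fiber weights 1/3 and 2/3
    out = np.empty_like(w)
    hi = w > 0.5
    mid = (w >= 0) & ~hi
    neg = w < 0
    out[hi] = Y(w[hi])
    out[mid] = (Y(w[mid]) + 2 * Y(-2 * w[mid])) / 3
    out[neg] = (Y(-w[neg] / 2) + 2 * Y(w[neg])) / 3
    return out

# A = X^{-1}(B) for B = [0.3, 0.9], i.e. A = [-0.9, -0.3] u [0.15, 0.45]
w = np.linspace(-1, 1, 2_000_001)
indA = ((X(w) >= 0.3) & (X(w) <= 0.9)).astype(float)

# since P = lambda/2 on [-1, 1], E[f] is approximated by the grid mean of f
lhs = np.mean(Z(w) * indA)
rhs = np.mean(Y(w) * indA)
print(abs(lhs - rhs))   # approx 0, up to grid error
```

One can also confirm $\sigma(X)$-measurability pointwise: `Z` returns the same value at the two fiber points, e.g. at $\omega=0.2$ and $\omega=-0.4$ (both mapped to $y=0.4$).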