I'm trying to understand the solution to the following question:
Let X and Y be independent random variables, uniformly distributed on the interval $[0,1]$. Since $X \neq 0 $ almost surely, the random variable $Z=\frac{Y}{X}$ is well defined.
Compute $P(X < x | \sigma(Y)) $ and $ P(X < x | \sigma(Z)) $.
How do you calculate a conditional probability in the case where you are conditioning on a sigma algebra? How is the answer below obtained?
$$P(X < x \mid \sigma(Y)) = \min\{x,1\} $$ $$P(X < x \mid \sigma(Z)) = \min\{x^2,1\}\, I_{\{ Z \leq 1 \}} + \min\{x^2 Z^2,1\}\, I_{\{ Z > 1\}} $$
The probability of an event $B$ given a sigma-algebra $\mathcal{F}$ is a random variable defined as $$P(B|\mathcal{F}) = E(I_B|\mathcal{F}),$$ where $I_B$ is the indicator function of $B$. Equivalently, $P(B|\mathcal{F})$ is the (almost surely unique) $\mathcal{F}$-measurable random variable satisfying $\int_A P(B|\mathcal{F})\,dP = P(A \cap B)$ for every $A \in \mathcal{F}$.
In your case, we would have $$P(X<x|\sigma(Y))=E(I\{X<x\}|Y).$$
To get the conditional expectation, we use the following method (described in more detail here (A.2)):
We see how the expectation of $I\{X<x\}$ depends on $Y$ by calculating $$E(I\{X<x\}|Y=y)=P(X<x|Y=y)=P(X<x)=\min\{x,1\}$$ for $x\geq0$. In the last step we dropped the conditioning, since $X$ and $Y$ are independent.
Now we would normally replace $y$ with $Y$ in the last expression to get $E(I\{X<x\}|Y)$. However, because of independence there is no $y$ left in the expression, so we simply have $E(I\{X<x\}|Y)=\min\{x,1\}$.
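Since the claim is just that conditioning on $Y$ changes nothing, it can be checked numerically. Below is a minimal Monte Carlo sketch (the threshold $x$, the value $y_0$ we condition near, and the window width are arbitrary illustrative choices): among samples with $Y \approx y_0$, the frequency of $\{X < x\}$ should still be close to $\min\{x,1\}$.

```python
import random

random.seed(0)
N = 200_000
x = 0.3        # threshold for the event {X < x} (illustrative choice)
y0 = 0.7       # we condition on Y being near y0 (illustrative choice)
eps = 0.01     # half-width of the conditioning window

hits = total = 0
for _ in range(N):
    X, Y = random.random(), random.random()
    if abs(Y - y0) < eps:      # keep only samples with Y ≈ y0
        total += 1
        hits += X < x

# By independence, the estimate should be close to min(x, 1) = 0.3,
# regardless of the choice of y0.
print(hits / total)
```

Changing $y_0$ leaves the estimate unchanged, which is exactly the statement that $P(X<x\mid\sigma(Y))$ is the constant $\min\{x,1\}$.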
The second problem is quite a bit harder. Similarly, you need to calculate $P(X<x|Z=z)$, for which you need the conditional density $f_{X|Z=z}$. A change of variables ($y=zx$, with Jacobian $x$) gives the joint density $f_{X,Z}(x,z)=x$ on $\{0<x<1,\ 0<xz<1\}$, hence the marginal $f_Z(z)=\tfrac12$ for $0<z\leq1$ and $f_Z(z)=\tfrac1{2z^2}$ for $z>1$, so $$f_{X|Z=z}(t)=\begin{cases}2t, & 0<t<1, & z\leq 1,\\ 2z^2 t, & 0<t<1/z, & z>1.\end{cases}$$ (Alternatively, Bayes' theorem for continuous random variables, as stated here, yields the same conditional density.) Then $$P(X<x|Z=z)=\int_0^x f_{X|Z=z}(t)\,dt = \min\{x^2,1\}\,I_{\{z\leq1\}}+\min\{x^2z^2,1\}\,I_{\{z>1\}}$$ for $x\geq0$; note the $x^2$ in the second term, which comes from integrating the linear density $2z^2t$. Finally, replacing $z$ with $Z$ gives the wanted solution.
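This, too, can be sanity-checked by simulation: fix some $z_0>1$, keep only samples with $Z\approx z_0$, and compare the frequency of $\{X<x\}$ with $\int_0^x f_{X|Z=z_0}(t)\,dt = \min\{x^2 z_0^2, 1\}$, the value obtained by integrating the conditional density $2z_0^2 t$ on $[0,1/z_0]$. The particular values of $x$, $z_0$, and the window width below are illustrative choices.

```python
import random

random.seed(1)
N = 1_000_000
x = 0.4        # threshold for {X < x} (illustrative choice)
z0 = 2.0       # we condition on Z being near z0 > 1 (illustrative choice)
eps = 0.02     # half-width of the conditioning window

hits = total = 0
for _ in range(N):
    X, Y = random.random(), random.random()
    if X == 0.0:               # avoid division by zero (probability-zero event)
        continue
    Z = Y / X
    if abs(Z - z0) < eps:      # keep only samples with Z ≈ z0
        total += 1
        hits += X < x

# Predicted value: min(x**2 * z0**2, 1) = min(0.64, 1) = 0.64
print(hits / total)
```

Note that the estimate matches $x^2 z_0^2 = 0.64$ rather than $x z_0^2 = 1.6$ (capped at 1), which distinguishes the quadratic CDF from a linear one.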