Compute $P(X<x\mid Y)$ and $P(X<x\mid Z)$ for $(X,Y)$ i.i.d. uniform on $(0,1)$ and $Z=Y/X$


I'm trying to understand the solution to the following question:

Let $X$ and $Y$ be independent random variables, uniformly distributed on the interval $[0,1]$. Since $X \neq 0$ almost surely, the random variable $Z=\frac{Y}{X}$ is well defined.

Compute $P(X < x | \sigma(Y)) $ and $ P(X < x | \sigma(Z)) $.

How do you calculate a conditional probability in the case where you are conditioning on a sigma algebra? How is the answer below obtained?

$$P(X < x \mid \sigma(Y)) = \min\{x,1\} $$ $$P(X < x \mid \sigma(Z)) = \min\{x^2,1\}\, I_{\{ Z \leq 1 \}} + \min\{x^2Z^2,1\}\,I_{\{ Z > 1\}} $$



BEST ANSWER

The probability of an event $B$ given a sigma-algebra $\mathcal{F}$ is a random variable defined as $$P(B\mid\mathcal{F}) = E(I_B\mid\mathcal{F})$$ where $I_B$ is the indicator function.

In your case, we would have $$P(X<x|\sigma(Y))=E(I\{X<x\}|Y).$$

To get the conditional expectation, we use the following method (described in more detail here (A.2)):

We see how the expectation of $I\{X<x\}$ depends on $Y$ by calculating $$E(I\{X<x\}\mid Y=y)=P(X<x\mid Y=y)=P(X<x)=\min \{x,1\}$$ for $x\geq0$. In the last step we dropped the conditioning, since $X$ and $Y$ are independent.

Now we should just replace $y$ with $Y$ in the last expression to get $E(I\{X<x\}\mid Y)$. However, because of independence there is no $y$ in the expression, so we simply have $E(I\{X<x\}\mid Y)=\min \{x,1\}$.
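As a sanity check on the independence argument, the empirical conditional CDF of $X$ should not depend on where $Y$ falls. A minimal Monte Carlo sketch (the sample size, the value of $x$, and the bands for $Y$ are arbitrary choices, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)

x = 0.7
# P(X < x | Y in band) should be min(x, 1) = 0.7, whatever the band is.
for lo, hi in [(0.0, 0.1), (0.45, 0.55), (0.9, 1.0)]:
    band = (Y >= lo) & (Y < hi)
    print(f"Y in [{lo}, {hi}): P(X < {x}) ~ {np.mean(X[band] < x):.3f}")
```

All three estimates land near $0.7$, as independence predicts.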

The second problem is quite a bit harder. Similarly, you need to calculate $P(X<x\mid Z=z)$. To do that, you can use Bayes' theorem for continuous random variables, as stated here, to get the conditional density $f_{X\mid Z=z}$. Then $P(X<x\mid Z=z)=\int_0^x f_{X\mid Z=z}(\xi)\,d\xi$, which evaluates to $\min\{x^2,1\}I_{\{z\leq1\}}+\min\{x^2z^2,1\}I_{\{z>1\}}$. Finally, replacing $z$ with $Z$ gives the desired solution.
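One can also check this formula numerically by conditioning on $Z$ falling in a thin band around a fixed $z$, since $P(X<x\mid Z=z)$ is approximated by $P(X<x\mid |Z-z|<\delta)$ for small $\delta$. A rough Monte Carlo sketch (sample size, band width, and test points are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)
Z = Y / X

def cond_cdf(x, z):
    # P(X < x | Z = z) as derived above.
    return min(x**2, 1.0) if z <= 1 else min((x * z) ** 2, 1.0)

for x, z in [(0.5, 0.5), (0.3, 2.0), (0.3, 3.0)]:
    band = np.abs(Z - z) < 0.01
    emp = np.mean(X[band] < x)
    print(f"x={x}, z={z}: empirical {emp:.3f}, formula {cond_cdf(x, z):.3f}")
```

The empirical conditional frequencies match $\min\{x^2,1\}$ for $z\leq1$ and $\min\{x^2z^2,1\}$ for $z>1$ up to Monte Carlo error.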

ANSWER

Conditioning on a sigma-algebra $\mathcal{G}$, a conditional probability is defined to be a $\mathcal{G}$-measurable function satisfying \begin{align} E(P(A \mid \mathcal{G}) \mathbb{1}_{G}) = P(A \cap G) \end{align} for any $G \in \mathcal{G}$.

Probabilities conditioned on sigma-algebras are a bit tricky in the sense that you don't directly calculate them. Rather, you guess a $\mathcal{G}$-measurable function and verify that it satisfies the above condition. For example,

Let $G \in \sigma(Y)$. Then \begin{align*} E(\min(x,1) \mathbb{1}_G) &= E(\min(x,1)) \cdot E(\mathbb{1}_G) \\ &= (\mathbb{1}_{\{x \leq 1\}}x + \mathbb{1}_{\{x > 1\}}) \cdot P(G) \\ &= P(X < x) \cdot P(G) \\ &= P(\{X < x\} \cap G) \end{align*} where the first equality holds because $\min(x,1)$ is a constant and the last follows by independence of $X$ and $Y$. Thus $P(X < x \mid \sigma(Y)) = \min(x,1)$.

Similarly for the second conditional probability, you need only show that \begin{align*} E\Big((\min(x^2,1)\mathbb{1}_{\{Z\leq 1\}}+\min(x^2 Z^2,1)\mathbb{1}_{\{Z > 1\}}) \mathbb{1}_G\Big) = P(\{X < x\} \cap G) \end{align*} for any $G \in \sigma(Z)$.
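Since the sets $\{Z \leq c\}$ generate $\sigma(Z)$, this defining property can be spot-checked by simulation: both sides should agree for every $c$. A minimal sketch (the choices of $x$ and $c$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)
Z = Y / X

x = 0.4
# Candidate version of P(X < x | sigma(Z)), written as a function of Z.
phi = np.where(Z <= 1, min(x**2, 1.0), np.minimum((x * Z) ** 2, 1.0))

for c in [0.5, 1.0, 3.0]:
    G = Z <= c
    lhs = np.mean(phi * G)        # E[ phi(Z) 1_G ]
    rhs = np.mean((X < x) & G)    # P( {X < x} and G )
    print(f"c={c}: E[phi 1_G] ~ {lhs:.4f}, P(X<x, G) ~ {rhs:.4f}")
```

For instance, with $x=0.4$ and $c=0.5$ both sides should come out near $0.16 \cdot P(Z\leq 0.5) = 0.04$.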

ANSWER

The first identity is direct: since $X$ and $Y$ are independent, $P(X<x\mid Y)=P(X<x)$ almost surely, end of story.

To show the second identity, since every distribution involved, conditional or not, has a PDF, a rather straightforward method is to compute the conditional PDF $f_{X\mid Z}$. This requires knowing the joint PDF $f_{X,Z}$ and the marginal PDF $f_Z$; then $$f_{X\mid Z}(x\mid z)=\frac{f_{X,Z}(x,z)}{f_Z(z)}$$ and, by definition, $P(X<x\mid Z)=g_x(Z)$ where, for every $z$, $$g_x(z)=\int_0^xf_{X\mid Z}(\xi\mid z)\,d\xi.$$

Sooo... to compute $f_{X,Z}$, we apply the classical Jacobian approach to the change of variable $(x,y)\to(x,z)=(x,y/x)$, which is such that $dx\,dy=x\,dx\,dz$. The $(x,y)$-domain $0\leqslant x,y\leqslant1$ becomes the $(x,z)$-domain $0\leqslant x\leqslant1$, $0\leqslant z\leqslant1/x$, hence $$f_{X,Z}(x,z)=x\mathbf 1_{0\leqslant x\leqslant1}\mathbf 1_{0\leqslant z\leqslant1/x}=x\mathbf 1_{z\geqslant0}\mathbf 1_{0\leqslant x\leqslant\min\{1,1/z\}}.$$

Thus, $$f_Z(z)=\int_\mathbb R f_{X,Z}(x,z)\,dx=\mathbf 1_{z\geqslant0}\int_0^{\min\{1,1/z\}}x\,dx=\tfrac12\min\{1,1/z\}^2\mathbf 1_{z\geqslant0}$$ and, for $x\geqslant0$, $$\int_0^xf_{X,Z}(\xi,z)\,d\xi=\mathbf 1_{z\geqslant0}\int_0^{\min\{1,1/z,x\}}\xi\, d\xi=\tfrac12\min\{1,x,1/z\}^2\mathbf 1_{z\geqslant0}.$$

Dividing these two yields, for $z\geqslant0$, $$g_x(z)=\frac{\min\{1,x,1/z\}^2}{\min\{1,1/z\}^2}$$ hence, for every $x\geqslant0$, $$P(X<x\mid Z)=\frac{\min\{1,x,1/Z\}^2}{\min\{1,1/Z\}^2}=\frac{\min\{1,xZ,Z\}^2}{\min\{1,Z\}^2},$$ which is equivalent to the identity in your question.
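As a quick check on the intermediate steps, integrating $f_Z(z)=\tfrac12\min\{1,1/z\}^2$ gives the CDF $F_Z(z)=z/2$ for $0\leqslant z\leqslant1$ and $F_Z(z)=1-1/(2z)$ for $z\geqslant1$, which is easy to compare against a simulation (sample size and test points are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)
Z = Y / X

def cdf_Z(z):
    # CDF obtained by integrating f_Z(z) = (1/2) * min(1, 1/z)^2 over [0, z].
    return z / 2 if z <= 1 else 1 - 1 / (2 * z)

for z in [0.25, 1.0, 4.0]:
    print(f"z={z}: empirical {np.mean(Z <= z):.4f}, analytic {cdf_Z(z):.4f}")
```

In particular $P(Z\leqslant1)=\tfrac12$, as the symmetry between $X$ and $Y$ demands.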