Conditioning on other random variables is something I don't have a great grasp of. I came across Example 3.2 in *Random Processes for Engineers* by Professor Hajek, and this concept is what prevents me from finishing the example. In the problem we have $Y=XU$, where $X$ and $U$ are independent random variables, $U\sim\mathrm{Unif}[0,1]$, and $X$ has the Rayleigh distribution $f_X(x) = \frac{x}{\sigma^2}e^{-x^2/(2\sigma^2)}$ for $x\ge 0$ (and $0$ otherwise). Part of the problem involves finding $E[Y\mid X]$, where the first step is finding the joint density $f_{XY}(x,y) = f_X(x)\,f_{Y|X}(y\mid x)$. In this type of problem I don't understand how we find the conditional density. I believe we write $P(Y\le y\mid X=x) = P(XU\le y\mid X=x) = P(U\le y/x\mid X=x)$, but how do we get from this step to the final step of resolving $f_{Y|X}(y\mid x)$?
Conditioning in the law of total probability
Asked by Bumbble Comm
The main step is to find the joint density $f_{XY}(x,y)$. This can be done using the Jacobian (change-of-variables) method.
To do that, set
$$\begin{cases} y=xu\\ z=x \end{cases}\rightarrow\begin{cases} x=z\\ u=y/z \end{cases} $$
The Jacobian of the inverse map is $|J|=1/z$, so by the change-of-variables formula, $f_{YZ}(y,z)=f_X(z)\,f_U(y/z)\,|J|$, which gives
$$f_{YZ}(y,z)=\frac{1}{\sigma^2}e^{-z^2/(2\sigma^2)}\cdot\mathbb{1}_{0<y<z<\infty}$$
Since $Z=X$, the conditional density follows by definition:
$$f_{Y|X}(y|x)=\frac{f_{YZ}(y,x)}{f_X(x)}=\frac{(1/\sigma^2)\,e^{-x^2/(2\sigma^2)}}{(x/\sigma^2)\,e^{-x^2/(2\sigma^2)}}\cdot\mathbb{1}_{0<y<x}=\frac{1}{x}\cdot\mathbb{1}_{0<y<x}$$
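To connect this with the CDF route attempted in the question: given $X=x$, we have $Y=xU$, so for $0<y<x$
$$P(Y\le y\mid X=x)=P(U\le y/x)=\frac{y}{x},$$
and differentiating with respect to $y$ recovers $f_{Y|X}(y\mid x)=1/x$ on $(0,x)$, in agreement with the Jacobian computation.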
That is a uniform density on $(0,x)$, with expectation
$$\mathbb{E}[Y|X=x]=x/2$$
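A quick Monte Carlo sanity check of this result (a sketch of my own, not from the book; the function names and the trick of conditioning on a narrow band $|X-x_0|<\varepsilon$ are hypothetical choices): sample $X$ from the Rayleigh distribution by inverse-CDF, keep only samples near $x_0$, and average $Y=XU$.

```python
import math
import random

def rayleigh_sample(sigma, rng):
    """Sample X ~ Rayleigh(sigma) via inverse CDF: F(x) = 1 - exp(-x^2 / (2 sigma^2))."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

def conditional_mean_y(x0, sigma=1.0, n=200_000, band=0.05, seed=0):
    """Monte Carlo estimate of E[Y | X ~= x0] for Y = X*U,
    approximating the conditioning by keeping samples with |X - x0| < band."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        x = rayleigh_sample(sigma, rng)
        if abs(x - x0) < band:
            total += x * rng.random()  # Y = X * U, with U ~ Unif[0,1] independent of X
            count += 1
    return total / count

# Estimate for x0 = 1.0: should be close to x0 / 2 = 0.5
print(conditional_mean_y(1.0))
```

The estimate converges to $x_0/2$ as the band shrinks and the sample size grows, consistent with $\mathbb{E}[Y\mid X=x]=x/2$.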