Conditional density function between Gammas


Let $(X,Y)$ be a random vector with joint probability density $$f_{XY} (x,y) = xe^{-x} 1_{(0, \infty)}(x) 1_{(0,1)}(y)$$ and let $U=XY$ and $V=X(1-Y)$.

Give the distribution of $X$ given $U=u$ for $u>0$.

I found that $U$ and $V$ each follow a $\Gamma(1,1)$ (i.e. standard exponential) distribution, and that $X$ follows a $\Gamma(2,1)$ distribution, since its marginal density is $xe^{-x}1_{(0,\infty)}(x)$. However, I do not know how to compute the conditional distribution of $X$ given $U=u$, since they are dependent variables...
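These marginal claims can be sanity-checked with a short Monte Carlo simulation (my own sketch, not part of the question; it uses NumPy and reads the marginals off the joint density: $X$ has density $xe^{-x}$, i.e. $\Gamma(2,1)$, $Y$ is an independent Uniform$(0,1)$, and $U=XY$, $V=X(1-Y)$ should each be $\Gamma(1,1)$, i.e. Exp(1)):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500_000

# X has density x * exp(-x) on (0, inf), i.e. Gamma(shape=2, rate=1);
# Y is an independent Uniform(0, 1), matching the factorized joint density.
x = rng.gamma(shape=2.0, scale=1.0, size=n)
y = rng.uniform(size=n)

u = x * y          # claimed Gamma(1,1), i.e. Exp(1)
v = x * (1 - y)    # claimed Gamma(1,1), i.e. Exp(1)

# Exp(1) has mean 1 and variance 1; Gamma(2,1) has mean 2 and variance 2.
print(f"U: mean={u.mean():.3f}, var={u.var():.3f}")   # ~1, ~1
print(f"V: mean={v.mean():.3f}, var={v.var():.3f}")   # ~1, ~1
print(f"X: mean={x.mean():.3f}, var={x.var():.3f}")   # ~2, ~2
```

The printed moments match the exponential and Gamma$(2,1)$ values up to Monte Carlo error.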

Any help would be appreciated.

Best answer:

A conditional distribution of $X$ given $U$ is a class of functions $Q:\mathbb{R}^2\to \mathbb{R}$ such that for every $c\in \mathbb{R}$ we have $$ P[X\leqslant c]=\int_{\mathbb{R}}Q(c,t)\,P_U(dt)\tag1 $$ where $P_U:=P\circ U^{-1}$ is the distribution of $U$. We represent this class of functions by expressions like $P[X\leqslant c\mid U=t]$: when we write $$ \int_{\mathbb{R}}P[X\leqslant c\mid U=t]\,P_U(dt)\tag2 $$ the expression $P[X\leqslant c\mid U=t]$ really stands for any of the $Q$ functions above. It can be shown that, when $U$ has a continuous distribution, for any fixed $c\in \mathbb{R}$ there are uncountably many different functions $t\mapsto Q(c,t)$, since any two versions need only agree $P_U$-almost everywhere. This is why asking for the value of $P[X\leqslant c\mid U=t]$ at one specific pair $(c,t)$ does not make sense on its own: that value can be literally anything.
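The nonuniqueness above can be made concrete (a supplementary remark of mine): if $U$ has a density, then every singleton is $P_U$-null, so modifying a valid $Q$ at a single point produces another valid version. For arbitrary $a\in\mathbb{R}$ and $t_0\in\mathbb{R}$,

$$ Q_a(c,t):=Q(c,t)+a\,\mathbf 1_{\{t_0\}}(t) $$

satisfies (1) just as $Q$ does, because the two functions differ only on the $P_U$-null set $\{t_0\}$ and the integral in (1) is unchanged. Hence the pointwise value $Q(c,t_0)$ can indeed be arbitrary.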

However, we can find some of these $Q$ functions, that is, a representative of the conditional distribution of $X$ given $U$. Note first that (1) alone is necessary but not sufficient: a genuine conditional distribution must satisfy $$ P[X\leqslant c,\;U\in B]=\int_{B}Q(c,t)\,P_U(dt)\tag3 $$ for every Borel set $B\subseteq\mathbb{R}$, of which (1) is the special case $B=\mathbb{R}$. When $(X,U)$ is absolutely continuous (that is, when it has a joint density $f_{X,U}$), both sides of (3) are integrals over $B$: the left side is $\int_B\left(\int_{-\infty}^{c}f_{X,U}(x,t)\,dx\right)dt$ and the right side is $\int_B Q(c,t)f_U(t)\,dt$. Equating the integrands, we can choose $Q$ by $$ Q(c,t):=\begin{cases} \dfrac{1}{f_U(t)}\displaystyle\int_{-\infty}^{c}f_{X,U}(x,t)\,dx,&\text{when }f_U(t)\neq 0\\ h(c,t),&\text{otherwise} \end{cases}\tag4 $$ for any arbitrarily chosen function $h$; it is common to choose $h$ to be zero. In this problem one can check that $U$ and $V$ are independent $\Gamma(1,1)$ variables with $X=U+V$, so $f_{X,U}(x,t)=e^{-x}\,\mathbf 1_{\{0<t<x\}}$ and $f_U(t)=e^{-t}\,\mathbf 1_{(0,\infty)}(t)$, and (4) gives $$ Q(c,t)=\big(1-e^{-(c-t)}\big)\mathbf 1_{(t,\infty)}(c),\qquad t>0, $$ that is, given $U=u$, the variable $X$ is distributed as $u$ plus a $\Gamma(1,1)$ random variable.
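For completeness, here is the change-of-variables computation behind the distributional facts used above (a supplementary derivation). With $u=xy$ and $v=x(1-y)$, inverting gives $x=u+v$ and $y=u/(u+v)$, with Jacobian determinant

$$ \det\frac{\partial(x,y)}{\partial(u,v)}=\det\begin{pmatrix}1&1\\[2pt] \frac{v}{(u+v)^2}&-\frac{u}{(u+v)^2}\end{pmatrix}=-\frac{1}{u+v}, $$

so

$$ f_{U,V}(u,v)=f_{X,Y}\!\Big(u+v,\tfrac{u}{u+v}\Big)\cdot\frac{1}{u+v}=(u+v)e^{-(u+v)}\cdot\frac{1}{u+v}=e^{-u}e^{-v},\qquad u,v>0. $$

Hence $U$ and $V$ are independent $\Gamma(1,1)$ variables. Since $X=U+V$ with $V$ independent of $U$,

$$ f_{X\mid U}(x\mid u)=f_V(x-u)=e^{-(x-u)}\,\mathbf 1_{(u,\infty)}(x), $$

i.e. given $U=u$, $X$ is distributed as $u$ plus a $\Gamma(1,1)$ random variable, in agreement with the conditional distribution function computed above.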