Conditional Probability Problem for a Joint Distribution


We have a joint probability distribution $f_{X,Y}(x,y)=\frac{1}{10}$, defined over the domain $(x,y)\in[-1,1]\times[-2,2]\cup[1,2]\times[-1,1]$. From this, I need to find the conditional PDFs $f_{X|Y}(x|y)$ and $f_{Y|X}(y|x)$.

I have figured out the marginal PDFs,

\begin{align*} f_X(x)&=\left\{\begin{array}{lr}\frac{2}{5},&-1\leq x\leq 1\\ \frac{1}{5},&1<x\leq2\end{array}\right.\\ \\ f_Y(y)&=\left\{\begin{array}{lr}\frac{2}{10},&-2\leq y<-1\\\frac{3}{10},&-1\leq y\leq 1\\ \frac{2}{10},&1<y\leq 2\end{array}\right. \end{align*}

However, I am stuck on how to find the conditionals because of the various domain constraints. I thought about splitting the region into two or three areas and then finding $f_{X|Y\in a}(x)$, but I wasn't sure how to move from this partition to a single function for $X|Y$ and $Y|X$, and every other idea I try fails to normalise.

Note: I eventually need to find the variances of $X$ and $Y$ using the expectation and variance of $Y|X$ and $X|Y$, I'm assuming via the law of total variance? If the functions stay split up, do I just evaluate each section individually and then add the pieces up?
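(In case it helps to sanity-check the marginals above, here is a quick Monte Carlo sketch; the sampler and variable names are my own scaffolding, not part of the problem statement.)

```python
import random

# Sample uniformly over the union of the two rectangles:
# [-1,1] x [-2,2] has area 8, [1,2] x [-1,1] has area 2, total 10,
# which matches the constant joint density 1/10.
def sample():
    if random.random() < 8 / 10:              # choose a rectangle by area
        return random.uniform(-1, 1), random.uniform(-2, 2)
    return random.uniform(1, 2), random.uniform(-1, 1)

random.seed(0)
pts = [sample() for _ in range(200_000)]

# The marginal f_X gives P(-1 <= X <= 1) = (2/5) * 2 = 4/5.
p_left = sum(1 for x, _ in pts if x <= 1) / len(pts)
print(round(p_left, 3))                       # ≈ 0.8
```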




$\begin{align} f_{X,Y}(x,y) & =\begin{cases}1/10 & : (x,y)\in [-1,1]\times[-2,2]\cup[1,2]\times[-1,1] \\ 0 & : \text{elsewhere} \end{cases} \\[3ex] f_X(x) & =\begin{cases}\int_{-2}^2 1/10 \operatorname d y & : x\in [-1,1] \\ \int_{-1}^{1} 1/10 \operatorname d y & : x\in(1, 2] \\ 0 & : \text{elsewhere} \end{cases} \\[1ex] & =\begin{cases}2/5 & : x\in[-1,1] \\ 1/5 & : x\in (1,2] \\ 0 & : \text{elsewhere} \end{cases} \\[3ex] f_{Y\mid X}(y\mid x) & =\frac{f_{X,Y}(x,y)}{f_X(x)} \\[1ex] & =\begin{cases}1/4 & : x\in[-1,1], y\in[-2,2] \\ 1/2 & : x\in (1,2],y\in[-1,1] \\ 0 & : \text{elsewhere} \end{cases} \\[3ex] \mathsf E_{Y\mid X}[Y\mid X] & = \begin{cases} \int_{-2}^{2} y /4 \operatorname d y & : x\in[-1,1]\\\int_{-1}^{1} y /2\operatorname d y & : x\in(1,2]\\ 0 & : \text{elsewhere} \end{cases} \\[1ex] & = 0 \\[3ex] \mathsf E_{Y\mid X}[Y^2\mid X] & = \begin{cases} \int_{-2}^{2} y^2 /4 \operatorname d y & : x\in[-1,1]\\\int_{-1}^{1} y^2 /2\operatorname d y & : x\in(1,2]\\ 0 & : \text{elsewhere} \end{cases} \\[1ex] & = \begin{cases}4/3 & : x\in[-1,1] \\ 1/3 & : x\in (1,2]\\ 0 & : \text{elsewhere} \end{cases} \\[3ex] \mathsf {Var}[Y] & = \mathsf E_X[\mathsf E_{Y\mid X}[Y^2\mid X]]-\mathsf E_X[\mathsf E_{Y\mid X}[Y\mid X]]^2 \\[1ex] & = \int_{-1}^1 4/3\times 2/5 \operatorname d x + \int_{1}^2 1/3\times 1/5\operatorname d x \\[1ex] & = \frac{17}{15} \end{align}$

etc.
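A quick Monte Carlo check of the $\mathsf{Var}[Y] = 17/15 \approx 1.133$ result above (my own sketch; the sampler draws $Y$ from the two rectangles in proportion to their areas):

```python
import random

# Y is uniform on [-2,2] with probability 8/10 (big rectangle)
# and uniform on [-1,1] with probability 2/10 (small rectangle).
random.seed(1)

def sample_y():
    if random.random() < 0.8:
        return random.uniform(-2, 2)
    return random.uniform(-1, 1)

ys = [sample_y() for _ in range(500_000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
print(round(var, 2))    # ≈ 1.13, i.e. 17/15
```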


To verify that these are valid conditional probability density functions, we check that they (1) are nonnegative and (2) integrate to 1 over the conditional domain.

Our $f_{Y\mid X}$ is nonnegative everywhere; it is in fact constant in $y$ for each possible value of $X$ (though a different constant for the two ranges of $x$).

$\begin{align} f_{Y\mid X}(y\mid x) & =\begin{cases}1/4 & : x\in[-1,1], y\in[-2,2] \\ 1/2 & : x\in (1,2],y\in[-1,1] \\ 0 & : \text{elsewhere} \end{cases} \end{align}$

Next we integrate with respect to $y$ over the entire conditioned domain for each given value of $X$, verifying that the conditional measure is $1$ for all possible values of $X$ (and $0$ elsewhere). That is, the result is the indicator function of the domain of $X$, as it should be. (We have ensured that $\mathsf P(Y\in \Omega_Y \mid X=x) = \operatorname{\bf 1}_{\Omega_X}(x)$.)

$\begin{align}\int_{\Omega_Y\mid X=x} f_{Y\mid X}(y\mid x)\operatorname d y & = \begin{cases}\int_{-2}^2 1/4\operatorname d y & : x\in [-1,1] \\ \int_{-1}^1 1/2 \operatorname d y & : x\in (1,2]\\0 & :\text{elsewhere}\end{cases} \\[1ex] & = \begin{cases}1 & : x\in[-1,1]\\ 1 & : x\in(1,2]\\ 0 & : \text{elsewhere}\end{cases} \\[1ex] & = \begin{cases}1 & : x\in[-1,2]\\ 0 & : \text{elsewhere}\end{cases} \\[1ex] & = \operatorname{\bf 1}_{[-1,2]}(x)\end{align}$

Thus this conditional function is indeed a valid conditional probability density function.
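The same normalisation check can be done numerically (a sketch of my own, not part of the derivation): Riemann-sum the conditional density over $y$ and confirm it integrates to 1 for a few fixed values of $x$ in each strip.

```python
# Piecewise conditional density from the answer above.
def f_y_given_x(y, x):
    if -1 <= x <= 1 and -2 <= y <= 2:
        return 0.25          # wide strip: Y | X is U[-2,2]
    if 1 < x <= 2 and -1 <= y <= 1:
        return 0.5           # narrow strip: Y | X is U[-1,1]
    return 0.0

n = 100_000
dy = 4 / n                   # integrate over y in [-2, 2]
totals = {}
for x in (-0.5, 0.0, 1.5):   # two points in the wide strip, one in the narrow
    totals[x] = sum(f_y_given_x(-2 + (i + 0.5) * dy, x) * dy
                    for i in range(n))
print(totals)                # each value ≈ 1.0
```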


This is the kind of question that is easy to do (and almost trivially so in this case) if you draw a diagram and think geometrically, visualizing the joint density as an object (of volume $1$) sitting on the $x$-$y$ plane. In this instance, this solid is the union of two prisms with rectangular bases, a large rectangle (of area $8$) whose opposite corners are at $(-1,-2)$ and $(1,2)$ and a smaller rectangle (of area $2$) whose opposite corners are at $(1,-1)$ and $(2,1)$. In addition to this visualization of the joint density as a solid, the tools you need are the following.

  1. If $g(x)$ is a function that takes on nonnegative values only and $\int_{-\infty}^\infty g(x)\,\mathrm dx$ has finite value (call this value $A$; we call it the area under the curve), then $g(x)/A$ is a valid probability density function.

  2. If $f_{X,Y}(x,y)$ is a joint probability density function, then the value of the marginal density of $X$ at any (fixed) point $x_0$ is the area of the cross-section of the pdf solid that you see if the pdf solid is sliced by a vertical plane through the line $x=x_0$. That is, we get the area under the curve of the function $f_{X,Y}(x_0,y)$ which is a function of $y$ (remember that $x_0$ is just a fixed constant value).

  3. The conditional density of $Y$ given that $x = x_0$ is proportional to the function $f_{X,Y}(x_0, y)$ regarded as a function of $y$ alone. This function is nonnegative, obviously, and we can make it into a valid density by dividing by the "area under the curve" as described in Item 1. Item 2 assures us that the area under the curve is just $f_X(x_0)$ leading to the formula beloved by writers of textbooks on probability: $$f_{Y\mid X}(y \mid X=x_0) = \frac{f_{X,Y}(x_0, y)}{f_X(x_0)}.$$
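Since the joint density is constant here, the textbook ratio can be evaluated exactly on each strip. A minimal exact-arithmetic sketch (the variable names are illustrative, not from the text):

```python
from fractions import Fraction as F

# f_{Y|X}(y | x0) = f_{X,Y}(x0, y) / f_X(x0), with f_{X,Y} = 1/10 everywhere
# on the support.
joint = F(1, 10)
cond_wide = joint / F(2, 5)    # x0 in [-1,1]
cond_narrow = joint / F(1, 5)  # x0 in (1,2]
print(cond_wide, cond_narrow)  # 1/4 1/2
```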

All this is, of course, fine and dandy but blindly applying formulas is not needed in this simple case; just visualization!

  • For any fixed $x_0 \in [-1,1]$, the cross-section $f_{X,Y}(x_0, y)$ is a rectangle whose base extends from $y=-2$ to $y = +2$. Similarly, for any fixed $x_0 \in (1,2]$, the cross-section $f_{X,Y}(x_0, y)$ is a rectangle whose base extends from $y=-1$ to $y=+1$. Hence,

For each $x_0 \in [-1,1]$, the conditional distribution of $Y$ given that $X = x_0$ is $U[-2,2]$.
For each $x_0 \in (1,2]$, the conditional distribution of $Y$ given that $X = x_0$ is $U[-1,1]$.

Note that we didn't really need to explicitly compute the value of $f_X(x_0)$ in order to determine the conditional density of $Y$.

The conditional mean of $Y$ given $X = x_0$ is trivial to obtain. We have

For each $x_0 \in [-1,2]$, $E[Y\mid X = x_0] = 0$.

The conditional variance of $Y$ given $X=x_0$ is also easy to write down if one remembers the formula $(b-a)^2/12$ or, if not, since $E[Y \mid X = x_0] = 0$, it is easy to compute $\operatorname{var}(Y\mid X = x_0) = E[Y^2\mid X = x_0]$.

For each $x_0 \in [-1,1]$, $\displaystyle \operatorname{var}(Y\mid X = x_0) = \frac{4^2}{12} = \frac 43$.
For each $x_0 \in (1,2]$, $\displaystyle\operatorname{var}(Y\mid X = x_0) = \frac{2^2}{12} = \frac 13$.
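To answer the OP's variance question with these pieces, the law of total variance gives $\operatorname{var}(Y) = E[\operatorname{var}(Y\mid X)] + \operatorname{var}(E[Y\mid X])$, and the second term is $0$ here because $E[Y\mid X = x_0] = 0$ for every $x_0$. A short exact-arithmetic sketch of that final step (my addition):

```python
from fractions import Fraction as F

# Var(Y) = E[Var(Y|X)] + Var(E[Y|X]); here E[Y|X] = 0 identically,
# so the second term vanishes.
p_wide = F(2, 5) * 2           # P(X in [-1,1]) = f_X value * interval length
p_narrow = F(1, 5) * 1         # P(X in (1,2])
e_var = p_wide * F(4, 3) + p_narrow * F(1, 3)
print(e_var)                   # 17/15
```

This matches the $17/15$ obtained directly in the first answer.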