Conditional probability density


I have a question about conditional distributions.

Let $X, Y, Z$ be independent and identically distributed (i.i.d.) random variables and let $U=X+Y$. Can I claim that the conditional densities satisfy $f_{(U\mid X+Y<b,\,Z<a)}(u)=f_{(U\mid X+Y<b)}(u)$, where $a, b$ are constants and $f(\cdot)$ denotes the p.d.f.?

If so, please demonstrate how to arrive at that claim.

I tried writing the conditional probability as $P(U<u\mid X+Y<b,Z<a)=\frac{P(X+Y<u,X+Y<b,Z<a)}{P(X+Y<b,Z<a)}$, and then

$P(X+Y<b,Z<a)=P(X+Y<b)P(Z<a)$, (I'm not sure about this step)

$P(X+Y<u,X+Y<b,Z<a)=P(X+Y<u,X+Y<b\mid Z<a)\cdot P(Z<a)$. Substituting these back and differentiating with respect to $u$ then gives the claim, but I'm not sure whether this is correct.
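The claim can also be checked numerically. Below is a Monte Carlo sketch (my own illustration, not part of the original post): taking $X,Y,Z$ i.i.d. standard normal and $U=X+Y$, the empirical distribution of $U$ conditioned on $\{X+Y<b,\,Z<a\}$ should match the one conditioned on $\{X+Y<b\}$ alone. The thresholds $a=0.5$ and $b=1.0$ are arbitrary choices.

```python
import numpy as np

# X, Y, Z i.i.d. standard normal; U = X + Y.
rng = np.random.default_rng(0)
n = 1_000_000
X, Y, Z = rng.standard_normal((3, n))
U = X + Y
a, b = 0.5, 1.0

both = U[(U < b) & (Z < a)]   # sample of U given {X+Y < b, Z < a}
xy_only = U[U < b]            # sample of U given {X+Y < b} only

# Compare a few empirical quantiles of the two conditional samples;
# they should agree up to Monte Carlo error if the claim holds.
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(both, qs))
print(np.quantile(xy_only, qs))
```

The two printed rows of quantiles agree to within sampling noise, which is consistent with the conditional distributions being equal.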

Thank you.


You can see the claim directly: since $Z$ is independent of $U=X+Y$, additionally conditioning on the event $\{Z<a\}$ does not change the conditional distribution of $U$.

The step you were unsure about is indeed true: $$P(X+Y<b,Z<a)=P(X+Y<b)P(Z<a).$$

It follows from the independence of the random variables $X+Y$ and $Z$.

This comes from the fact that if $X,Y,Z$ are independent, then the random variables $f(X,Y)$ and $Z$ are independent for any measurable function $f:\mathbb{R^2}\to\mathbb{R}$. Here you simply take $f(x,y)=x+y$.
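As a quick numerical sanity check (my own illustration): if $X+Y$ and $Z$ are independent, the joint probability should factor as the product of the marginals. Here $X,Y,Z$ are taken i.i.d. uniform on $[0,1]$, and $a=0.7$, $b=1.2$ are arbitrary.

```python
import numpy as np

# X, Y, Z i.i.d. uniform on [0, 1]; check the factorization
# P(X+Y < b, Z < a) = P(X+Y < b) * P(Z < a) by Monte Carlo.
rng = np.random.default_rng(1)
n = 1_000_000
X, Y, Z = rng.uniform(size=(3, n))
a, b = 0.7, 1.2

p_joint = np.mean((X + Y < b) & (Z < a))
p_prod = np.mean(X + Y < b) * np.mean(Z < a)
print(p_joint, p_prod)  # nearly equal
```

The two estimates coincide up to Monte Carlo error, as the independence of $f(X,Y)=X+Y$ and $Z$ predicts.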

More generally, if $(X_i^j)_{i=1\ldots n,j=1\ldots m}$ is an array of $nm$ independent random variables then for any sequence $(f_i)_{i=1\ldots n}$ of measurable functions $f_i:\mathbb{R^m}\to\mathbb{R}$, we obtain the independence of the sequence of random variables $(Y_i)_{i=1\ldots n}$ given by $$Y_i=f_i(X_i^1,\ldots,X_i^m)\qquad i=1\ldots n.$$
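The general statement can be illustrated with $n=m=2$ (again my own example, not from the original answer): four independent variables $X_i^j$, with $Y_1=f_1(X_1^1,X_1^2)=X_1^1+X_1^2$ and $Y_2=f_2(X_2^1,X_2^2)=X_2^1\cdot X_2^2$. Since $Y_1$ and $Y_2$ are built from disjoint sets of independent inputs, they are independent, so their joint CDF factors.

```python
import numpy as np

# Array X[i, j] of 4 independent standard normals (n = m = 2);
# Y1 and Y2 use disjoint rows, hence are independent, so
# P(Y1 < s, Y2 < t) = P(Y1 < s) * P(Y2 < t).
rng = np.random.default_rng(2)
n = 1_000_000
X = rng.standard_normal((2, 2, n))  # X[i, j] holds n samples of X_i^j
Y1 = X[0, 0] + X[0, 1]              # f_1(x, y) = x + y
Y2 = X[1, 0] * X[1, 1]              # f_2(x, y) = x * y
s, t = 0.3, -0.2                    # arbitrary thresholds

p_joint = np.mean((Y1 < s) & (Y2 < t))
p_prod = np.mean(Y1 < s) * np.mean(Y2 < t)
print(p_joint, p_prod)  # nearly equal
```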