Independent and identically distributed random variables.


Let $X$ and $Y$ be two i.i.d. random variables. I am trying to prove that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$. The author claims that this is true by symmetry, but I would like to prove it rigorously.

I haven't studied joint distributions yet. Here is my attempt for the proof.

Let $\Omega$ be the sample space of interest. Then we have the following.

$\mathbb{P}(X<Y)=\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)$

$\mathbb{P}(Y<X)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)$

One way to show that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$ is to show that the following equation holds:

$$\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)\tag{1}$$

However, I couldn't continue further with this approach. Instead, I used the following method, which works only if $X$ and $Y$ are discrete random variables.

Let $A:=\{x\in\mathbb{R}:\mathbb{P}(X=x)>0\}$. Then $A$ is the support of $X$, and of $Y$ as well, since $X$ and $Y$ are identically distributed.

Let $B:=\{(x,y)\in A\times A:x<y\}$.

We then have

$\begin{equation*}\begin{split}\mathbb{P}(X<Y)&=\sum_{(x,y)\in B}\mathbb{P}(X=x,Y=y)\\&=\sum_{(x,y)\in B}\mathbb{P}(X=x)\mathbb{P}(Y=y)\quad\quad\quad \left(X,Y \text{ are independent}\right)\\ &=\sum_{(x,y)\in B}\mathbb{P}(Y=x)\mathbb{P}(X=y)\quad\quad\quad \left(X,Y \text{ are identically distributed}\right)\\&=\mathbb{P}(Y<X)\end{split}\end{equation*}$

The above proof holds when $X$ and $Y$ are discrete. But how do I approach this problem when $X$ and $Y$ are continuous random variables? Also, is it possible to prove the above fact by directly showing that equation $(1)$ holds?
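As a sanity check (not part of the proof), the discrete symmetry argument can be verified exactly on a small example; the support and probabilities below are arbitrary illustrative choices, not from the question:

```python
# Exact check of P(X < Y) = P(Y < X) for a small discrete distribution,
# using exact rational arithmetic. The pmf below is an arbitrary example.
from fractions import Fraction

# pmf of X (and of Y, since they are identically distributed)
pmf = {0: Fraction(1, 2), 1: Fraction(1, 3), 2: Fraction(1, 6)}

# Independence: P(X = x, Y = y) = P(X = x) * P(Y = y)
p_x_lt_y = sum(pmf[x] * pmf[y] for x in pmf for y in pmf if x < y)
p_y_lt_x = sum(pmf[x] * pmf[y] for x in pmf for y in pmf if y < x)

print(p_x_lt_y, p_y_lt_x)  # equal, as the symmetry argument predicts
```

Swapping the roles of the two identical pmfs maps each term of the first sum onto a term of the second, which is exactly the relabelling step in the derivation above.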

There are 2 solutions below.

Solution 1:

Assume $X$ and $Y$ have a common density $f$ (possible since they are identically distributed). By independence the joint density is $f(x)f(y)$, so
$$P(X<Y)=\int_{-\infty}^{\infty}\int_{-\infty}^{y} f(x)\,f(y)\;dx\;dy.$$
Relabel the dummy variables $x\leftrightarrow y$ (legitimate precisely because the same density $f$ appears in both factors):
$$\int_{-\infty}^{\infty}\int_{-\infty}^{x} f(y)\,f(x)\;dy\;dx = P(Y<X).$$
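The continuous case can also be checked numerically. Below is a minimal Monte Carlo sketch (my addition, not part of the answer), using the exponential distribution as an arbitrary choice of continuous law:

```python
# Monte Carlo check that P(X < Y) ~ P(Y < X) for i.i.d. continuous variables.
# The exponential law is an arbitrary illustrative choice.
import random

random.seed(0)
n = 200_000
x_lt_y = y_lt_x = 0
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)  # drawn independently, same distribution
    if x < y:
        x_lt_y += 1
    elif y < x:
        y_lt_x += 1

# Both frequencies should be close to 1/2 (ties occur with probability 0).
print(x_lt_y / n, y_lt_x / n)
```

For a continuous law both probabilities are exactly $1/2$, since $\mathbb{P}(X=Y)=0$ and the two events partition the rest of the probability.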

Solution 2:

Since $X$ and $Y$ are independent and have the same distribution, the characteristic functions satisfy $$\varphi_{X-Y}(t)=\varphi_X(t)\varphi_{-Y}(t)=\varphi_X(t)\varphi_{-X}(t)$$ and $$\varphi_{Y-X}(t)=\varphi_Y(t)\varphi_{-X}(t)=\varphi_X(t)\varphi_{-X}(t).$$ Hence $\varphi_{X-Y}(t)=\varphi_{Y-X}(t)$ for all $t$, so by the uniqueness theorem for characteristic functions $X-Y$ and $Y-X$ have the same distribution. In particular, $$\mathbb{P}(X<Y)=\mathbb{P}(X-Y<0)=\mathbb{P}(Y-X<0)=\mathbb{P}(Y<X).$$
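As an empirical illustration of this identity (a sketch of my own, with Uniform(0, 1) as an arbitrary choice of distribution), the empirical characteristic functions of $X-Y$ and $Y-X$, estimated from independent batches of samples, agree up to Monte Carlo error:

```python
# Compare empirical characteristic functions of X - Y and Y - X.
# Uniform(0, 1) is an arbitrary illustrative choice of distribution.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Two independent batches of i.i.d. pairs, one per difference.
x1, y1 = rng.uniform(size=n), rng.uniform(size=n)
x2, y2 = rng.uniform(size=n), rng.uniform(size=n)

diffs = []
for t in (0.5, 1.0, 2.0):
    phi_xy = np.exp(1j * t * (x1 - y1)).mean()  # estimates phi_{X-Y}(t)
    phi_yx = np.exp(1j * t * (y2 - x2)).mean()  # estimates phi_{Y-X}(t)
    diffs.append(abs(phi_xy - phi_yx))
    print(t, diffs[-1])  # small: only sampling error remains
```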