link between cdf and expectation - learning the basics


Let $X$, $Y$ be random variables. Assume that the cumulative distribution functions of $X$ and $Y$ satisfy the following relation: $$F_X(x) > F_Y(x) \hspace{2mm} \forall x.$$

Is it possible to conclude that $E[X] > E[Y]$?

My idea: Let $f_X, f_Y$ respectively be the probability density functions of $X$ and $Y$ (assuming they exist). Then

$ F_X(x) > F_Y(x) \hspace{2mm} \forall x \Rightarrow \int_{-\infty}^x f_X(t)\,\mathrm{d}t > \int_{-\infty}^x f_Y(t)\,\mathrm{d}t \hspace{2mm} \forall x \Rightarrow f_X(x) > f_Y(x) \hspace{2mm} \forall x \Rightarrow E[X] > E[Y]$
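As a sanity check on this chain of implications (a sketch of my own, not part of the original question, using shifted normal distributions as an illustrative choice): the middle implication fails. In fact $f_X(x) > f_Y(x)$ for all $x$ is impossible, since both densities integrate to $1$.

```python
import math

def Phi(x):
    """CDF of the standard normal distribution."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def phi(x):
    """PDF of the standard normal distribution."""
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

# X ~ N(0,1), Y ~ N(1,1): F_X(x) = Phi(x) > Phi(x-1) = F_Y(x) for every x,
# yet f_X(x) > f_Y(x) fails for x >= 1/2 (the densities cross there).
for x in [-2.0, 0.0, 0.5, 2.0]:
    cdf_holds = Phi(x) > Phi(x - 1)
    pdf_holds = phi(x) > phi(x - 1)
    print(x, cdf_holds, pdf_holds)
```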


Best answer:

Assuming the existence of $\mathbb{E}[X]$ and $\mathbb{E}[Y]$, we can conclude that $\mathbb{E}[X] < \mathbb{E}[Y]$. An intuitive explanation is that $F_X(x) > F_Y(x)$ can be read as follows: $X \leq x$ is more likely to happen than $Y \leq x$, so $X$ tends to take smaller values than $Y$.

Indeed, this easily follows once we prove the following formula

$$ \mathbb{E}[X] = \int_{\mathbb{R}} \left( \mathbf{1}_{[0,\infty)}(x) - F_X(x) \right) \, \mathrm{d}x. $$

Assuming this, we have

$$ \mathbb{E}[Y] - \mathbb{E}[X] = \int_{\mathbb{R}} \left( F_X(x) - F_Y(x) \right) \, \mathrm{d}x > 0, $$

where the last step follows from the fact that we are integrating a strictly positive function. So it remains to prove the formula. To this end, it is convenient to write $X = X^+ - X^-$, where $X^+ = \max\{X, 0\}$ is the positive part of $X$ and $X^- = \max\{-X, 0\}$ is the negative part of $X$. Then for each part,

$$ \mathbb{E}[X^+] = \mathbb{E}\left[ \int_{0}^{\infty} \mathbf{1}_{\{ x < X \}} \, \mathrm{d}x \right] = \int_{0}^{\infty} \mathbb{E}\left[ \mathbf{1}_{\{ x < X \}} \right] \, \mathrm{d}x = \int_{0}^{\infty} \left( 1 - F_X(x) \right) \, \mathrm{d}x $$

and

$$ \mathbb{E}[X^-] = \int_{0}^{\infty} \mathbb{E}\left[ \mathbf{1}_{\{ x \leq -X \}} \right] \, \mathrm{d}x = \int_{-\infty}^{0} \mathbb{E}\left[ \mathbf{1}_{\{ X \leq x \}} \right] \, \mathrm{d}x = \int_{-\infty}^{0} F_X(x) \, \mathrm{d}x. $$

Subtracting the second identity from the first gives $\mathbb{E}[X] = \mathbb{E}[X^+] - \mathbb{E}[X^-] = \int_{\mathbb{R}} \left( \mathbf{1}_{[0,\infty)}(x) - F_X(x) \right) \mathrm{d}x$, which is the claimed formula.
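The formula can also be sanity-checked numerically. Below is a sketch of my own (the choice $X \sim N(1.5, 1)$ is illustrative, not from the answer) that approximates $\int_{\mathbb{R}} \left( \mathbf{1}_{[0,\infty)}(x) - F_X(x) \right) \mathrm{d}x$ by a midpoint Riemann sum and compares it with the known mean:

```python
import math

def F(x, mu=1.5):
    """CDF of a normal N(mu, 1) random variable."""
    return 0.5 * (1 + math.erf((x - mu) / math.sqrt(2)))

# Approximate  \int_R ( 1_{[0,oo)}(x) - F(x) ) dx  by a midpoint Riemann
# sum on [-20, 20]; the tails outside that interval are negligible here.
h, lo = 1e-3, -20.0
est = 0.0
for i in range(40000):
    x = lo + (i + 0.5) * h
    indicator = 1.0 if x >= 0 else 0.0
    est += (indicator - F(x)) * h

print(round(est, 3))  # should be close to the true mean, mu = 1.5
```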

Another answer:

In addition to the comments above, I would like to point out that the inequality is actually in the opposite direction for non-negative random variables: $F_X(x) \geq F_Y(x)$ for all $x$ implies $$\mathbb{E}[X] = \int_0^{\infty} \left(1 - F_X(x)\right) \mathrm{d}x \leq \int_0^{\infty} \left(1 - F_Y(x)\right) \mathrm{d}x = \mathbb{E}[Y].$$
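To illustrate (an example of my own, not from the answer): take $X \sim \text{Exp}(2)$ and $Y \sim \text{Exp}(1)$, so $F_X(x) = 1 - e^{-2x} \geq 1 - e^{-x} = F_Y(x)$ for all $x \geq 0$, while $\mathbb{E}[X] = 1/2 \leq 1 = \mathbb{E}[Y]$. A short script checking this via the tail-integral formula:

```python
import math

def expected_value_via_tail(rate, h=1e-3, hi=40.0):
    """Approximate E = int_0^oo (1 - F(x)) dx for an Exp(rate) variable,
    where 1 - F(x) = exp(-rate * x), by a midpoint Riemann sum."""
    n = int(hi / h)
    return sum(math.exp(-rate * (i + 0.5) * h) * h for i in range(n))

EX = expected_value_via_tail(2.0)  # tail integral of Exp(2), approx 1/2
EY = expected_value_via_tail(1.0)  # tail integral of Exp(1), approx 1
print(round(EX, 3), round(EY, 3))
```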