Covariance of max and min of independent uniform random variables


For $X$, $Y$ independent uniform random variables distributed $U[0,1]$, let $H = \max(X,Y)$, $L = \min(X,Y)$. We know that:

  1. $\operatorname{Cov}\left\{\min(X,Y),\max(X,Y)\right\} \neq \operatorname{Cov}\{X,Y\}$.
  2. $\mathbb{E}\left\{\min(X,Y)\cdot \max(X,Y)\right\} = \mathbb{E}\left\{XY\right\}$.
  3. $\min(X,Y)+\max(X,Y)=X+Y$.

My question is: why does equality fail in (1) but hold in (2) and (3)? In particular, (1) and (2) seem very analogous to me. Any suggestions toward a formal proof or a mathematical intuition would be greatly appreciated!
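All three claims are easy to check numerically. A minimal Monte Carlo sketch (the sample size and seed are arbitrary choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(0.0, 1.0, n)
low, high = np.minimum(x, y), np.maximum(x, y)

# (3) holds pointwise: min + max = x + y for every sample pair.
assert np.allclose(low + high, x + y)

# (2) E[LH] = E[XY]: the sample means agree up to Monte Carlo error (both near 1/4).
print(np.mean(low * high), np.mean(x * y))

# (1) the covariances differ: Cov(X,Y) is near 0, Cov(L,H) is near 1/36.
print(np.cov(x, y)[0, 1], np.cov(low, high)[0, 1])
```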


Accepted answer:

One of $L$ and $H$ is $X$ and the other is $Y$, so $L + H = X + Y$, and similarly $L H = X Y$ (and so also $\mathbb E[L + H] = \mathbb E[X + Y]$ and $\mathbb E[L H] = \mathbb E[X Y]$).

Now $\text{Cov}(X,Y) = \mathbb E[XY] - \mathbb E[X]\; \mathbb E[Y]$ and $\text{Cov}(L, H) = \mathbb E[L H ] - \mathbb E[L] \;\mathbb E[H]$. The difference between these is

$$ \text{Cov}(X,Y) - \text{Cov}(L, H) = \mathbb E[X]\; \mathbb E[Y] - \mathbb E[L]\; \mathbb E[H]$$

But here $\mathbb E[X] = \mathbb E[Y] = 1/2$, while $\mathbb E[L] = 1/2 - c$ and $\mathbb E[H] = 1/2 + c$ for some $c > 0$: their sum is $1$ (as noted above), and since $L < H$ with probability $1$, we have $\mathbb E[L] < \mathbb E[H]$. So

$$ \text{Cov}(X,Y) - \text{Cov}(L, H) = \frac{1}{4} - \left(\frac{1}{2} - c\right) \; \left(\frac{1}{2} + c\right) = c^2 > 0 $$
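For $U[0,1]$ the shift works out to $c = 1/6$ (since $\mathbb E[L] = 1/3$ and $\mathbb E[H] = 2/3$, as computed in the other answer), so the gap is $c^2 = 1/36$. A small exact-arithmetic check of this identity:

```python
from fractions import Fraction as F

# For U[0,1]: E[L] = 1/3, E[H] = 2/3, so c = 1/2 - 1/3 = 1/6.
c = F(1, 2) - F(1, 3)

# Cov(X,Y) - Cov(L,H) = E[X]E[Y] - E[L]E[H] = 1/4 - (1/2 - c)(1/2 + c)
gap = F(1, 4) - (F(1, 2) - c) * (F(1, 2) + c)
print(c, gap)  # 1/6 1/36
assert gap == c * c
```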

Another answer:

As (3) is trivial, I will demonstrate (2) and then (1).

Observe that as $X$ and $Y$ are independent, $$\mathbb{E}\{XY\}=\mathbb{E}\{X\}\mathbb{E}\{Y\}=\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}$$ and $$\operatorname{Cov}\{X,Y\}=\mathbb{E}\{XY\}-\mathbb{E}\{X\}\mathbb{E}\{Y\}=0\,.$$ Now, $L$ and $H$ are order statistics of $n=2$ uniformly distributed random variables, and hence are NOT independent. However, we know they follow a joint distribution $$f_{L,H}(\ell,h)=n(n-1)(h-\ell)^{n-2} = 2\,,\quad 0\leq \ell \leq h \leq 1\,.$$

Thus, \begin{align} \mathbb{E}\left\{LH\right\}&=\int_{\ell=0}^1\int_{h=\ell}^1 2h\ell\,\mathrm{d}h\, \mathrm{d}\ell\\ &=\int_{\ell=0}^1 h^2\big|_{h=\ell}^{1}\cdot\ell\,\mathrm{d}\ell\\ &=\int_{\ell=0}^1 (1-\ell^2)\ell\,\mathrm{d}\ell\\ &=\Big(\frac{1}{2}\ell^2-\frac{1}{4}\ell^4\Big)\Big|_0^1\\ &=\frac{1}{2}-\frac{1}{4}\\ &=\frac{1}{4}\,. \end{align}
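The double integral over the triangle $0\leq \ell\leq h\leq 1$ can be sanity-checked numerically; here is a dependency-free midpoint-rule sketch (the grid size is an arbitrary choice):

```python
# Midpoint-rule approximation of the integral of 2*h*l over 0 <= l <= h <= 1,
# which should come out near E[LH] = 1/4.
N = 400
dx = 1.0 / N
total = 0.0
for i in range(N):
    l = (i + 0.5) * dx
    for j in range(N):
        h = (j + 0.5) * dx
        if h >= l:  # keep only the triangle where min <= max
            total += 2 * h * l * dx * dx
print(total)  # ≈ 0.25 (up to discretization error along the diagonal)
```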

For the covariance, note that $L$ and $H$ follow Beta distributions as follows: \begin{align} H&\sim \operatorname {Beta} (2,1)\quad\Rightarrow&\mathbb{E}\{H\}&=\frac{2}{2+1}=\frac{2}{3}\,,\\ L&\sim \operatorname {Beta} (1,2)\quad\Rightarrow&\mathbb{E}\{L\}&=\frac{1}{1+2}=\frac{1}{3}\,,\\ \end{align}

and hence $$\operatorname{Cov}\{L,H\}=\mathbb{E}\{LH\}-\mathbb{E}\{L\}\mathbb{E}\{H\}=\frac{1}{4}-\frac{2}{3}\cdot\frac{1}{3}=\frac{1}{4}-\frac{2}{9}=\frac{1}{36}\,.$$
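The final arithmetic can be verified exactly with rational numbers:

```python
from fractions import Fraction as F

e_lh = F(1, 4)               # E[LH], from the double integral above
e_l, e_h = F(1, 3), F(2, 3)  # means of Beta(1,2) and Beta(2,1)
cov = e_lh - e_l * e_h
print(cov)  # 1/36
```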