Expected value of a maximum of two draws compared to expected value of each


I am no mathematician, so I apologise in advance for not explaining myself properly, and for asking something that is probably utterly obvious for most of you.

The question has to do with the expected value of the maximum of two draws from two different distributions. The result seems intuitive to me, but I have looked for a theorem or a proof on this website and others, and I could not find what I was looking for.

Let $X$ and $Y$ be two random variables. Ideally I do not want to have to assume that they are drawn from the same distribution, although I guess there needs to be an assumption of common support to have the results below hold with strict inequality. So for simplicity let's say that the two random variables are normals (which of course guarantees common support) but may have different means and variances: $X \sim {N}(\mu_x, \sigma^2_x)$ and $Y \sim {N}(\mu_y, \sigma^2_y)$. I think the results below should not need the distribution to be normal, just the support to overlap somewhere, but assuming normality is totally fine in my context. I assume the two draws to be independent.

All I need to show are these three things:

$E[\text{max}(X,Y)]>E(X)$ (because of the common support, I think this inequality and the next one hold strictly)

$E[\text{max}(X,Y)]>E(Y)$

and

$E[\text{max}(X,Y)]<E(X)+E(Y)$

Intuitively, all three of these are rather clear to me, but I could not find a rigorous proof. Are there theorems or simple steps that I could follow to prove them? Using uniform distributions for both random variables, or using the same distribution for both, I can solve it, but this feels like a more general result, so it is unsatisfactory to make those assumptions.
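As a sanity check (not a proof), a quick simulation illustrates the claims; the means and variances below are arbitrary choices for demonstration:

```python
# Monte Carlo check of E[max(X,Y)] versus E[X] and E[Y] for two
# independent normals with (arbitrarily chosen) different parameters.
import random
import statistics

random.seed(0)
n = 200_000
mu_x, sigma_x = 0.0, 1.0   # X ~ N(0, 1)
mu_y, sigma_y = 0.5, 2.0   # Y ~ N(0.5, 4)

xs = [random.gauss(mu_x, sigma_x) for _ in range(n)]
ys = [random.gauss(mu_y, sigma_y) for _ in range(n)]
maxes = [max(x, y) for x, y in zip(xs, ys)]

e_x = statistics.fmean(xs)
e_y = statistics.fmean(ys)
e_max = statistics.fmean(maxes)

# The sample mean of max(X, Y) exceeds both individual sample means.
print(e_x, e_y, e_max)
```

With these parameters the simulated $E[\max(X,Y)]$ comes out above both $E[X]$ and $E[Y]$, consistent with the first two claims; of course this checks only one parameter choice and proves nothing in general.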

Thank you so much for your help, and let me also congratulate all of you that have been so helpful to me and to so many others with your answers. All the best,

Michele

1 Answer

The last inequality is untrue for general distributions. Suppose $X$ and $Y$ are $+1$ or $-1$ independently, each with probability $\frac12$. Then $E[X]+E[Y]=0+0=0$, but $E[\max(X,Y)]=\frac14\cdot(-1)+\frac34\cdot 1=\frac12$.
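The counterexample can be verified exactly by enumerating the four equally likely outcomes; a minimal sketch:

```python
# Exact check of the coin-flip counterexample: X, Y are independent
# fair +/-1 variables, so the four (x, y) outcomes are equally likely.
from itertools import product

outcomes = list(product([-1, 1], repeat=2))  # (-1,-1), (-1,1), (1,-1), (1,1)

e_x = sum(x for x, _ in outcomes) / 4            # E[X] = 0
e_y = sum(y for _, y in outcomes) / 4            # E[Y] = 0
e_max = sum(max(x, y) for x, y in outcomes) / 4  # (-1 + 1 + 1 + 1)/4 = 1/2

print(e_x + e_y, e_max)  # 0.0 versus 0.5, so E[max(X,Y)] > E[X] + E[Y]
```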

For normally distributed random variables both with mean $0$, it is also untrue: then $E[X]+E[Y]=0$, while $E[\max(X,Y)]>0$. (Indeed $\max(X,Y)\ge X$ pointwise, with strict inequality on the event $\{Y>X\}$, which has positive probability when the supports overlap; so the first two inequalities do hold, and they force $E[\max(X,Y)]>E[X]=0$.)

Therefore, the last inequality is incorrect.

If, however, $X,Y>0$, then $\max(X,Y)< X+Y$ pointwise, and therefore $E[\max(X,Y)]< E[X+Y]=E[X]+E[Y]$.
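One way to see the pointwise bound is through the identity
$$\max(X,Y) = X + Y - \min(X,Y).$$
If $X,Y>0$ almost surely, then $\min(X,Y)>0$ almost surely, so $E[\min(X,Y)]>0$ and
$$E[\max(X,Y)] = E[X] + E[Y] - E[\min(X,Y)] < E[X] + E[Y].$$
The same identity explains the mean-zero counterexamples above: there $E[\min(X,Y)]<0$, which pushes $E[\max(X,Y)]$ above $E[X]+E[Y]$.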