This is a bit of an open-ended question that's been bugging me for a while, and any help or insight would be appreciated. My apologies in advance if I make any math sins, please correct me if so.
Assume $X$ and $Y$ are random variables. Let $M = \max(X,Y)$ and let $H = \max(X-Y, 0)$.
I'm interested in comparing the variances between $M$ and $H$. To be more precise, I want to know, without necessarily assuming anything about the joint distribution of the two variables, if there is a simple/intuitive way to express either $V(M) - V(H)$, or possibly $\frac{V(M)}{V(H)}$, where V denotes variance.
To give an example of what I mean, I'm able to derive the following for $E(M)-E(H)$, where E denotes expectation.
First, I write: $E(M)=E[X|(X \geq Y)]P(X \geq Y) + E[Y|(X < Y)]P(X < Y)$
Second, I write: $E(H)=E[(X-Y)|(X \geq Y)]P(X \geq Y) = E[X|(X \geq Y)]P(X \geq Y) - E[Y|(X \geq Y)]P(X \geq Y)$
Using the above, I can derive: $E(M)-E(H) = E[Y|(X < Y)]P(X < Y) + E[Y|(X \geq Y)]P(X \geq Y) = E(Y)$
This makes immediate intuitive sense to me, because $M - H$ will always equal $Y$, regardless of where $X$ and $Y$ land. Essentially, I'm looking for a similar bit of intuition, but for $V(M) - V(H)$, where V here denotes variance.
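This identity is easy to sanity-check by simulation. Below is a pure-Python sketch; the dependent normal pair is just an arbitrary illustrative choice, since the identity holds for any joint distribution:

```python
import random

# Sanity check: M - H = Y holds pointwise, so the sample means of
# M - H and Y must agree (up to float rounding) for ANY joint
# distribution. The dependent normal pair is an arbitrary choice.
random.seed(0)
n = 200_000

diff_sum = 0.0  # running sum of M - H
y_sum = 0.0     # running sum of Y
for _ in range(n):
    z = random.gauss(0, 1)       # shared component makes X, Y dependent
    x = z + random.gauss(0, 1)
    y = z + random.gauss(0, 1)
    m = max(x, y)
    h = max(x - y, 0.0)
    diff_sum += m - h
    y_sum += y

print(abs(diff_sum - y_sum) / n)  # ~0, float rounding only
```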
$V(M) - V(H) = (E(M^2) - E(M)^2) - (E(H^2) - E(H)^2)$
which could be rewritten as: $(E(M^2) - E(H^2)) - (E(M)^2-E(H)^2)$
I have some further notes/attempts after this, but nothing that's been useful. When I try to expand it out, I get very long expressions with lots of different terms, so I'm wondering if there's a simpler way to think about this problem. I have some intuition that the difference should depend on the expectation of $X-Y$: for example, if the joint distribution of $X, Y$ is such that $Y$ is always greater than $X$, then $H = \max(X-Y, 0)$ will always equal $0$, and so its variance will be $0$. I also suspect the covariance between $X$ and $Y$ should show up somewhere.
Please let me know if there's anything I need to clarify. I know "intuitive formula" is subjective so this might be a bad math question, but I've been beating my head against the wall with this one and any help's greatly appreciated.
Update:
I've made some progress, but I'm still struggling with the final expression. With some help from the comments, I've been able to derive the expression:
$Var(M) - Var(H) $
$= Var(Y) + 2 \cdot P(X \geq Y) \cdot \{E[(XY - Y^2)|(X \geq Y)]-E[(X-Y)|(X \geq Y)] \cdot E(Y) \}$
I've written up the steps to reach the expression in an answer and posted it. I'm satisfied with this expression, assuming it's correct.
Second Update:
After working through it some more, I feel confident that the derived expression holds. I was struggling with an example case that didn't seem to add up for a while, but it turns out I was making a mistake in the computation. I'll include the example case I was looking at below, in case anyone's interested.
Example Case:
Let $X,Y$ be jointly distributed such that $f(x,y)=1$ on the square $0<x<1$, $0<y<1$. Let $M = \max(X,Y)$ and let $H = \max(X-Y, 0)$.
We can calculate the following:
$Var(Y) = \frac{1}{12}$
$P(X \geq Y) = \frac{1}{2}$
$E[(XY-Y^2)|(X \geq Y)] = \int_{0}^{1} \int_{0}^{x} xy - y^2 \,dy \,dx \cdot \frac{1}{P(X \geq Y)} = (\frac{1}{8} - \frac{1}{12}) \cdot 2 = \frac{1}{12}$
$E[(X-Y)|(X \geq Y)] = \int_{0}^{1} \int_{0}^{x} x-y \,dy \,dx \cdot \frac{1}{P(X \geq Y)} = \frac{1}{6} \cdot 2 = \frac{1}{3}$
$E(Y) = \frac{1}{2}$
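These two conditional expectations can be sanity-checked numerically. A rough midpoint-rule sketch over the triangle $0 < y < x < 1$ (pure Python; the grid size is an arbitrary choice):

```python
# Midpoint-rule approximation of the two integrals over the triangle
# 0 < y < x < 1 (the event X >= Y), where the joint density is 1.
n = 400
d = 1.0 / n
i1 = 0.0  # integral of xy - y^2 over the triangle
i2 = 0.0  # integral of x - y  over the triangle
for i in range(n):
    x = (i + 0.5) * d
    for j in range(i):            # cells strictly below the diagonal
        y = (j + 0.5) * d
        i1 += (x * y - y * y) * d * d
        i2 += (x - y) * d * d

p = 0.5  # P(X >= Y)
print(i1 / p, i2 / p)  # should be close to 1/12 and 1/3
```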
Then, using the previously derived expression:
$Var(M) - Var(H)$
$= \frac{1}{12} + 2 \cdot \frac{1}{2} \cdot (\frac{1}{12} - \frac{1}{3} \cdot \frac{1}{2})$
$= 0$
This now lines up with what I get when I calculate $Var(M)$ and $Var(H)$ directly. Simulating values suggests that both $Var(M)$ and $Var(H)$ equal $\frac{1}{18}$.
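A simulation along these lines (not necessarily the exact code I ran, but a pure-Python sketch of the idea) reproduces that:

```python
import random

# Monte Carlo estimate of Var(M) and Var(H) for independent
# Uniform(0,1) X and Y; both should come out near 1/18 ~ 0.0556.
random.seed(1)
n = 200_000

m_vals = []
h_vals = []
for _ in range(n):
    x = random.random()
    y = random.random()
    m_vals.append(max(x, y))
    h_vals.append(max(x - y, 0.0))

def var(vs):
    mu = sum(vs) / len(vs)
    return sum((v - mu) ** 2 for v in vs) / len(vs)

print(var(m_vals), var(h_vals))  # both close to 1/18
```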
A second/direct method for calculating the variances:
For $M = \max(X,Y)$, I use the fact that $X$ and $Y$ are independent here (the joint density factors):
$F_M (m) = P(M \leq m) = P(X \leq m, Y \leq m) = P(X \leq m) \cdot P(Y \leq m)$, on $0 < m < 1$
Since $F_X (x) = x$ and $F_Y (y) = y$, then we have:
$P(X \leq m) \cdot P(Y \leq m) = F_X (m) F_Y (m) = m^2$, on $0 < m < 1$
So, $F_M (m) = m^2$ and we can derive the pdf:
$f_M (m) = 2m$, on $0 < m < 1$
From there we can derive that $E(M) = \frac{2}{3}$ and $E(M^2) = \frac{1}{2}$, so $Var(M) = \frac{1}{18}$
For $H = \max(X-Y, 0)$, I first used a change of variables to find the density function of $X-Y$.
I believe we can write:
$U = \phi_1 (X,Y) = X - Y$ and $V = \phi_2 (X,Y) = X$.
Then, $X = \psi_1 (U,V) = V$ and $Y = \psi_2 (U,V) = V-U$.
Then $|J|$ should equal 1, and we can write the density function of $U$ and $V$ as:
$g(u,v) = 1$ on the bounds: $0 < v < 1$ and $v-1 < u < v$
I then derive the density function of $U$:
$g_U (u) = \int_{0}^{u+1} 1\,dv = u+ 1$ on $-1 < u < 0$
$g_U (u) = \int_{u}^{1} 1\,dv = 1 - u$ on $0 < u < 1$
I believe we can then take $H = \max(U, 0)$ and derive the first two moments like this:
$E(H) = \int_{-1}^{0} 0 \cdot (u+ 1) \,du + \int_{0}^{1} u \cdot (1-u) \,du = \frac{1}{6}$
$E(H^2) = \int_{-1}^{0} 0^2 \cdot (u+ 1) \,du + \int_{0}^{1} u^2 \cdot (1-u) \,du = \frac{1}{12}$
From there I calculate $Var(H) = \frac{1}{12} - \left(\frac{1}{6}\right)^2 = \frac{1}{18}$
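As a cross-check on these two moments, here's a quick midpoint-rule integration of $u \cdot g_U(u)$ and $u^2 \cdot g_U(u)$ on $(0,1)$ (a pure-Python sketch; the negative part of $U$ contributes nothing since $H = 0$ there):

```python
# Midpoint-rule check of the moments of H = max(U, 0): only the
# positive part of U = X - Y contributes, with density 1 - u on (0, 1).
n = 100_000
du = 1.0 / n
eh = 0.0   # E(H)
eh2 = 0.0  # E(H^2)
for i in range(n):
    u = (i + 0.5) * du
    w = (1.0 - u) * du   # density times cell width
    eh += u * w
    eh2 += u * u * w

print(eh, eh2, eh2 - eh ** 2)  # close to 1/6, 1/12, 1/18
```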
Thanks so much to everyone who helped me with this. I really appreciate it!
Since $M = \max(X,Y)$ and $H=\max(X-Y, 0)$, we have the relationship $M - H = Y$.
Then, $M = H + Y$.
Then, $Var(M) = Var(H + Y) = Var(H)+Var(Y)+2Cov(H,Y) $.
This yields an expression for $Var(M)-Var(H)$.
I want to get this just in terms of $X$ and $Y$, so I break out the covariance term a bit.
$Cov(H,Y)=E(HY)-E(H)E(Y)$
I believe I can write:
$E(H)$
$=E[(X-Y)|(X \geq Y)]P(X \geq Y)+E[0|(X < Y)]P(X < Y)$
$=E[(X-Y)|(X \geq Y)]P(X \geq Y)$
$E(HY)$
$=E[HY|(X \geq Y)]P(X \geq Y)+E[HY|(X < Y)]P(X < Y)$
$=E[(XY-Y^2)|(X \geq Y)]P(X \geq Y)+E[0|(X < Y)]P(X < Y)$
$=E[(XY-Y^2)|(X \geq Y)]P(X \geq Y)$
So then I can find the expression:
$Var(M)-Var(H)$
$=Var(Y)+2Cov(H,Y)$
$=Var(Y)+2[E(HY)-E(H)E(Y)]$
$=Var(Y)+2\{E[(XY-Y^2)|(X \geq Y)]P(X \geq Y)-E[(X-Y)|(X \geq Y)]P(X \geq Y)E(Y)\}$
$=Var(Y)+2 \cdot P(X \geq Y) \cdot \{E[(XY-Y^2)|(X \geq Y)]-E[(X-Y)|(X \geq Y)] \cdot E(Y)\}$
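One way to gain confidence in this identity is a Monte Carlo check with a dependent, non-uniform pair (an arbitrary illustrative choice). Both sides should agree up to rounding, since the derivation only used $M = H + Y$:

```python
import random

# Monte Carlo check of
#   Var(M) - Var(H)
#     = Var(Y) + 2 P(X>=Y) { E[XY - Y^2 | X>=Y] - E[X-Y | X>=Y] E(Y) }
# on a dependent, non-normal-means pair (arbitrary illustrative choice).
random.seed(2)
n = 100_000

xs, ys = [], []
for _ in range(n):
    z = random.gauss(0, 1)                          # shared component
    xs.append(0.5 + z + 0.5 * random.gauss(0, 1))   # dependent, unequal means
    ys.append(z + random.gauss(0, 1))

def mean(vs):
    return sum(vs) / len(vs)

def var(vs):
    mu = mean(vs)
    return sum((v - mu) ** 2 for v in vs) / len(vs)

m = [max(x, y) for x, y in zip(xs, ys)]
h = [max(x - y, 0.0) for x, y in zip(xs, ys)]

lhs = var(m) - var(h)

# Right-hand side: condition on the event {X >= Y}
cond = [(x, y) for x, y in zip(xs, ys) if x >= y]
p = len(cond) / n
e_xy_y2 = mean([x * y - y * y for x, y in cond])
e_xmy = mean([x - y for x, y in cond])
rhs = var(ys) + 2 * p * (e_xy_y2 - e_xmy * mean(ys))

print(lhs, rhs)  # the two sides agree up to rounding
```

In fact, because $M - H = Y$ holds pointwise, the two sides agree as *sample* quantities, not just in expectation, so the match is exact up to floating-point error rather than Monte Carlo error.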
This is the simplest way I've found to write it.