Proof regarding conditional expectation of sum of minima


Assume $X_1, X_2$ are continuous, bounded random variables whose joint and marginal density functions are well-defined; the dependence structure between them is unspecified.

Let \begin{align} I_1 := \min(X_1, \overset{\sim}{a_1}), \quad I_2 := \min(X_2,\overset{\sim}{a_2}), \end{align} where $\overset{\sim}{a_1}, \overset{\sim}{a_2}$ are non-negative numbers. Let $\alpha \in (0, 1)$ be fixed, and let $F_{I_1+I_2}^{-1}(\alpha)$ denote the $\alpha$-quantile of $I_1+I_2$. Moreover, define $[t]^+ := \max(t, 0)$.

I am trying to show that given $a_1, a_2, \overset{\sim}{a_1}, \overset{\sim}{a_2}$ such that \begin{align} a_1+a_2 = F_{I_1+I_2}^{-1}(\alpha) \end{align} and

\begin{align} \overset{\sim}{a_1}+ \overset{\sim}{a_2} > F_{I_1+I_2}^{-1}(\alpha), \end{align} with $a_1 < \sup X_1, a_2 < \sup X_2$, then

\begin{align} & E[I_1+I_2 \ | \ I_1 + I_2 \geq F_{I_1+I_2}^{-1}(\alpha)] + E\left[ [X_1 -\overset{\sim}{a_1}]^+\right] + E\left[ [X_2 -\overset{\sim}{a_2}]^+\right]\\ & \leq a_1 + a_2 + E\left[ [X_1 -a_1]^+\right] + E\left[ [X_2 -a_2]^+\right]. \end{align}
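For concreteness, here is a minimal Monte Carlo sketch of this inequality. All distributional choices (the bounded dependent pair, the caps, the split of $a_1+a_2$) are assumptions made purely for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative bounded, dependent pair (assumed for this check only):
# a shared uniform factor u induces positive dependence.
u = rng.uniform(size=n)
x1 = u + 0.2 * rng.uniform(size=n)        # supported on (0, 1.2)
x2 = u**2 + 0.3 * rng.uniform(size=n)     # supported on (0, 1.3)

alpha = 0.9
a1t, a2t = 1.0, 1.1                        # the caps ~a_1, ~a_2 (assumed values)
i1 = np.minimum(x1, a1t)
i2 = np.minimum(x2, a2t)
s = i1 + i2

q = np.quantile(s, alpha)                  # empirical F_{I1+I2}^{-1}(alpha)
assert a1t + a2t > q                       # hypothesis of the claim

a1 = 0.5 * q                               # one admissible split with a1 + a2 = q,
a2 = q - a1                                # a1 < sup X1 and a2 < sup X2 here

lhs = s[s >= q].mean() + np.maximum(x1 - a1t, 0).mean() \
                       + np.maximum(x2 - a2t, 0).mean()
rhs = a1 + a2 + np.maximum(x1 - a1, 0).mean() \
              + np.maximum(x2 - a2, 0).mean()
print(f"LHS = {lhs:.4f}, RHS = {rhs:.4f}")
```

Varying the dependence, the caps, and the split of $a_1+a_2$ in such a sketch is what "trying to contradict this by Monte Carlo" amounts to below.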

I have not been able to contradict this using Monte Carlo simulation, but I cannot seem to prove it formally. I have considered

\begin{align} E[I_1+I_2 \ | \ I_1 + I_2 \geq F_{I_1+I_2}^{-1}(\alpha)] &= \frac{1}{1-\alpha} E\left[(I_1+I_2)\, \chi_{\{I_1+I_2 \geq F_{I_1+I_2}^{-1}(\alpha)\}}\right]\\ &= \frac{1}{1-\alpha} \left( \int_{\mathbb{R}} x f_{I_1+I_2}(x) \chi_{ \{\overset{\sim}{a_1}+\overset{\sim}{a_2} > x \geq F_{I_1+I_2}^{-1}(\alpha)\}}\ dx + (\overset{\sim}{a_1} +\overset{\sim}{a_2})\, P(I_1+I_2 = \overset{\sim}{a_1} +\overset{\sim}{a_2}) \right), \end{align} where the second term accounts for the point mass of $I_1+I_2$ at $\overset{\sim}{a_1}+\overset{\sim}{a_2}$, namely $P(I_1+I_2 = \overset{\sim}{a_1}+\overset{\sim}{a_2}) = P(X_1 \geq \overset{\sim}{a_1},\ X_2 \geq \overset{\sim}{a_2})$; the density $f_{I_1+I_2}$ only describes the distribution below $\overset{\sim}{a_1}+\overset{\sim}{a_2}$, since $I_1+I_2 \leq \overset{\sim}{a_1}+\overset{\sim}{a_2}$ almost surely.

On the region of integration, $x \geq F_{I_1+I_2}^{-1}(\alpha) = a_1+a_2$, so the conditional expectation is at least $a_1+a_2$. At the same time, $E[ [X_1-a_1]^+]$ and $E[ [X_2-a_2]^+]$ are decreasing in $a_1$ and $a_2$, respectively; but since the assumption is on the sum $a_1+a_2$ rather than on $a_1, a_2$ individually, the change in these expectations intuitively depends on the tails of the distributions of $X_1$ and $X_2$.
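The split of the tail expectation into a density part and an atom at $\overset{\sim}{a_1}+\overset{\sim}{a_2}$ can be sanity-checked numerically; the independent uniforms below are an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400_000
# Assumed independent uniforms, purely for illustration.
x1 = rng.uniform(0, 1.5, size=n)
x2 = rng.uniform(0, 1.5, size=n)
a1t, a2t = 1.0, 1.2
s = np.minimum(x1, a1t) + np.minimum(x2, a2t)
top = a1t + a2t                    # essential supremum of I1 + I2
q = np.quantile(s, 0.9)

# E[S 1{S >= q}] = (contribution strictly below top) + top * P(S = top)
tail = s >= q
full = s[tail].sum() / n
split = s[tail & (s < top)].sum() / n + top * (s == top).mean()
assert abs(full - split) < 1e-8
# The atom P(S = top) is exactly P(X1 >= ~a1, X2 >= ~a2):
assert np.isclose((s == top).mean(), ((x1 >= a1t) & (x2 >= a2t)).mean())
```

In particular, the empirical mass at `top` is strictly positive, which is why the tail expectation cannot be written with the density alone.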

Is there a way forward with this problem? Or is my setting too general? Do I need to assume independence between $X_1$ and $X_2$ to get anywhere?

Edit: Another consideration - the terms satisfy \begin{align} E[[X_1-\overset{\sim}{a_1} ]^+] + E[[X_2-\overset{\sim}{a_2}]^+] & \geq E[[X_1 + X_2 - (\overset{\sim}{a_1} +\overset{\sim}{a_2})]^+], \end{align} since $[u+v]^+ \leq [u]^+ + [v]^+$ pointwise (subadditivity of $[\cdot]^+$, which follows from its convexity together with $[0]^+ = 0$).
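Since $[u+v]^+ \leq [u]^+ + [v]^+$ holds pointwise, the inequality between the expectations holds for any dependence structure; a quick numerical check (with an assumed bounded, dependent pair) is:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# Assumed bounded, negatively dependent pair, for illustration only.
x1 = rng.uniform(0, 2, size=n)
x2 = 1.5 - 0.5 * x1 + rng.uniform(0, 1, size=n)
a1t, a2t = 1.2, 1.4

# Pointwise [u+v]^+ <= [u]^+ + [v]^+ implies the same in expectation.
lhs = np.maximum(x1 - a1t, 0).mean() + np.maximum(x2 - a2t, 0).mean()
rhs = np.maximum(x1 + x2 - (a1t + a2t), 0).mean()
assert lhs >= rhs - 1e-12
```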

On the other hand, using the tail-integral identity $E[[X-a]^+] = \int_a^{\infty} (1-F_X(t))\ dt$, \begin{align} & E[[X_1-a_1 ]^+] + E[[X_2-a_2]^+] - E[[X_1-\overset{\sim}{a_1} ]^+] - E[[X_2-\overset{\sim}{a_2}]^+]\\ &= \int_{a_1}^{\overset{\sim}{a_1}} (1-F_{X_1}(t)) \ dt + \int_{a_2}^{\overset{\sim}{a_2}} (1-F_{X_2}(t)) \ dt, \end{align}

so, since $a_1+a_2 = F^{-1}_{I_1+I_2}(\alpha)$, it suffices to show that \begin{align} E[I_1+I_2 \ | \ I_1 + I_2 \geq F_{I_1+I_2}^{-1}(\alpha)] - F^{-1}_{I_1+I_2}(\alpha) \leq \int_{a_1}^{\overset{\sim}{a_1}} (1-F_{X_1}(t))\ dt + \int_{a_2}^{\overset{\sim}{a_2}} (1-F_{X_2}(t)) \ dt. \end{align}
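The difference of two stop-loss terms can be cross-checked against the tail-integral (layer-cake) identity $E[[X-a]^+] = \int_a^{\infty}(1-F_X(t))\,dt$. For a Uniform(0,1) example (chosen only because the integral is available in closed form):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(size=500_000)     # X ~ Uniform(0,1), so 1 - F(t) = 1 - t
a, at = 0.3, 0.7                  # a and ~a, illustrative values with a < ~a

# Monte Carlo estimate of E[(X-a)^+] - E[(X-~a)^+]
mc = np.maximum(x - a, 0).mean() - np.maximum(x - at, 0).mean()
# Closed form: integral of (1 - t) dt from a to ~a
exact = ((1 - a)**2 - (1 - at)**2) / 2
assert abs(mc - exact) < 5e-3
```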