I would like to know under what conditions on a random variable $Y$ we have $\mathbb{E}[\max\{X_1+Y,X_2\}] > \mathbb{E}[\max\{X_1, X_2\}]$, where $X_1$ and $X_2$ are i.i.d.
Any help would be appreciated.
Comment by OP incorporated by dfeuer
I tried to use the upper and lower bounds on the highest order statistic for i.n.i.d. (independent, non-identically distributed) and i.i.d. random variables to solve the problem, but they are not tight. A brute-force approach might be to apply a convolution to the sum term, then use the CDF of the highest order statistic for i.n.i.d. random variables.
Since $\max\{X_1+Y,X_2\}=X_1+\max\{Y,X_2-X_1\}$, you can start off by subtracting $\mathrm{E}[X_1]$ from both sides: letting $Z=X_2-X_1$, the desired inequality becomes $$\mathrm{E}[\max(Y,Z)]>\mathrm{E}[\max(0,Z)].$$
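The reduction holds samplewise, so it is easy to sanity-check by simulation. A minimal sketch with purely illustrative distributions (standard normal $X_1,X_2$ and $Y\sim N(0.5,1)$, none of which come from the question):

```python
import numpy as np

# Sanity check of the reduction max(X1+Y, X2) = X1 + max(Y, X2-X1),
# hence E[max(X1+Y, X2)] = E[X1] + E[max(Y, Z)] with Z = X2 - X1.
# Distributions are illustrative only (not part of the question).
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = rng.normal(0.5, 1.0, n)

lhs = np.maximum(x1 + y, x2).mean()              # E[max(X1+Y, X2)]
rhs = x1.mean() + np.maximum(y, x2 - x1).mean()  # E[X1] + E[max(Y, Z)]
assert abs(lhs - rhs) < 1e-9  # equal samplewise, up to roundoff
```

Note that this step uses no independence between $Y$ and the $X_i$; it is an algebraic identity on each sample.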
For any distribution $U$ with cumulative distribution function $F_U(t)=\Pr[U\le t]$, we can write $$ \mathrm{E}[U]=\int_{-\infty}^\infty \chi(t>0)-F_U(t)\,dt, $$ where $\chi(t>0)$ is the indicator function, equal to 1 when $t>0$ is true and 0 when it is false.
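This tail-integral identity can be verified numerically. A sketch with the illustrative choice $U\sim\mathrm{Uniform}(-1,2)$, so $\mathrm{E}[U]=1/2$ and $F_U(t)=(t+1)/3$ on $[-1,2]$ (this example is not from the question):

```python
import numpy as np

# Check E[U] = ∫ (1{t>0} - F_U(t)) dt for U ~ Uniform(-1, 2):
# E[U] = 1/2 and F_U(t) = (t+1)/3 on [-1, 2].
# The integrand vanishes outside [-1, 2], so a grid there suffices.
t = np.linspace(-1.0, 2.0, 300_001)
F = np.clip((t + 1.0) / 3.0, 0.0, 1.0)
integrand = (t > 0).astype(float) - F
# trapezoidal rule by hand (avoids np.trapz vs np.trapezoid naming issues)
integral = float(np.sum((integrand[:-1] + integrand[1:]) * np.diff(t)) / 2.0)
assert abs(integral - 0.5) < 1e-4
```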
Using $F_{\max(Y,Z)}(t)=F_Y(t)F_Z(t)$, which holds when $Y$ is independent of $Z$ (e.g. when $Y$ is independent of $(X_1,X_2)$), the inequality becomes equivalent to $$ \int_{-\infty}^\infty \chi(t>0)-F_Y(t)F_Z(t)\,dt >\int_0^\infty 1-F_Z(t)\,dt, $$ which we can rewrite as $$ \int_{-\infty}^\infty \Big[\chi(t>0)-F_Y(t)\Big]\cdot F_Z(t)\,dt>0. $$ I can't quite see a way of transforming this into a simple statistical statement, i.e. one in terms of expected values or probabilities. However, if we define $$ H_Z(t)=\int_0^t F_Z(s)\,ds, $$ with the convention that $\int_0^a=-\int_a^0$ to handle the $a<0$ case, we can rewrite the integral condition as $$ \mathrm{E}\left[H_Z(Y)\right]=\int_{-\infty}^\infty H_Z(t)\,dF_Y(t)>0, $$ where $dF_Y(t)=f_Y(t)\,dt$ if $f_Y$ is the probability density of $Y$ (when it exists).
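Under the assumption that $Y$ is independent of $(X_1,X_2)$, the criterion $\mathrm{E}[H_Z(Y)]>0$ can be checked against direct simulation of the original inequality. A sketch with illustrative choices not taken from the question: $X_1,X_2$ i.i.d. $N(0,1)$, so $Z\sim N(0,2)$ and $H_Z$ has a closed form, and $Y\equiv c$ a constant, so $\mathrm{E}[H_Z(Y)]=H_Z(c)$:

```python
import numpy as np
from math import erf, exp, pi, sqrt

# Illustrative setup: X1, X2 iid N(0,1), so Z = X2 - X1 ~ N(0, 2);
# Y = c is a constant shift, so E[H_Z(Y)] = H_Z(c).
SIGMA = sqrt(2.0)

def Phi(x):   # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def phi(x):   # standard normal density
    return exp(-x * x / 2.0) / sqrt(2.0 * pi)

def H_Z(t):
    # H_Z(t) = ∫_0^t F_Z(s) ds for Z ~ N(0, σ²), using the
    # antiderivative σ(xΦ(x) + φ(x)) evaluated at x = t/σ and x = 0
    x = t / SIGMA
    return t * Phi(x) + SIGMA * (phi(x) - phi(0.0))

rng = np.random.default_rng(1)
x1 = rng.standard_normal(1_000_000)
x2 = rng.standard_normal(1_000_000)

gaps = {}
for c in (0.3, -0.3):
    # Monte Carlo estimate of E[max(X1+c, X2)] - E[max(X1, X2)]
    gaps[c] = np.maximum(x1 + c, x2).mean() - np.maximum(x1, x2).mean()
    # By the derivation, the gap should equal E[H_Z(Y)] = H_Z(c)
    assert abs(gaps[c] - H_Z(c)) < 5e-3
    assert (gaps[c] > 0) == (H_Z(c) > 0)
```

For a constant shift $Y\equiv c$ the criterion reduces to $H_Z(c)>0$, which here holds exactly when $c>0$ (since $F_Z>0$ everywhere for normal $Z$), matching the intuition that a deterministic upward shift helps.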