Suppose that $X_1$ and $X_2$ are two $i.i.d.$ real-valued random variables with $\mathbb{E}|X_1|<\infty$. Can we prove the following inequality? $$ \mathbb{E}\Big[X_1\Big]^+ \ge \mathbb{E}\bigg[\frac{X_1+X_2}{2} \bigg]^+ $$ where $[a]^+=\max\{a,0\}$ and $\mathbb{E}[\,\cdot\,]^+$ abbreviates $\mathbb{E}\big([\,\cdot\,]^+\big)$.
What about the case without the $i.i.d.$ condition?
I know how to do that now. Using $[a]^+=\frac{|a|+a}{2}$ and writing $\mu=\mathbb{E}X_1=\mathbb{E}X_2$, $$ \mathbb{E}\Big[X_1\Big]^+ = \mathbb{E}\,\frac{|X_1|+X_1}{2}=\frac{1}{2}\mathbb{E}|X_1| + \frac{\mu}{2}. $$
$$ \begin{aligned} \mathbb{E}\bigg[\frac{X_1+X_2}{2} \bigg]^+ &= \mathbb{E}\,\frac{\big|\frac{X_1+X_2}{2}\big|+\frac{X_1+X_2}{2}}{2} = \frac{1}{4}\mathbb{E}|X_1+X_2| + \frac{\mu}{2} \\ &\le \frac{1}{4}\big(\mathbb{E}|X_1|+\mathbb{E}|X_2|\big) + \frac{\mu}{2} \\ &= \frac{1}{2}\mathbb{E}|X_1| + \frac{\mu}{2}, \end{aligned} $$ where the inequality is the triangle inequality and the last step uses that $X_1$ and $X_2$ have the same distribution.
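As a quick Monte Carlo sanity check of the claimed inequality (not a proof, and the choice of a $N(-0.5,1)$ distribution is just an arbitrary example where the positive part is non-trivial):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# i.i.d. samples with negative mean, so [.]^+ truncates a real portion of the mass
x1 = rng.normal(loc=-0.5, scale=1.0, size=n)
x2 = rng.normal(loc=-0.5, scale=1.0, size=n)

lhs = np.maximum(x1, 0.0).mean()             # estimate of E[X1]^+
rhs = np.maximum((x1 + x2) / 2, 0.0).mean()  # estimate of E[(X1+X2)/2]^+

print(lhs, rhs)
assert lhs >= rhs  # should hold up to Monte Carlo error
```

Averaging shrinks the spread around the (negative) mean, so less mass survives the positive part, which is why the gap here is clearly visible rather than borderline.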
But what about the case without the $i.i.d.$ condition?
HINT:
First: prove (e.g. by case analysis) that $\forall a, b \in \mathbb{R}: [a+b]^+ \le [a]^+ + [b]^+$.
Then: Use monotonicity and linearity of expectation. What does this let you conclude for the case of dependent $X_1, X_2$? (I assume they still have the same distribution; the comparison is meaningless if $X_2$ has a different distribution, since it appears on only one side.)
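Both parts of the hint can be sanity-checked numerically. The sketch below (assuming standard NumPy; the distributions are arbitrary examples) first tests the pointwise subadditivity on random pairs, then tries an extreme dependent-but-identically-distributed case, $X_2 = -X_1$ with symmetric $X_1$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Pointwise subadditivity: [a+b]^+ <= [a]^+ + [b]^+ for every pair
a = rng.normal(size=10_000)
b = rng.normal(size=10_000)
assert np.all(np.maximum(a + b, 0.0) <= np.maximum(a, 0.0) + np.maximum(b, 0.0))

# Dependent but identically distributed: X2 = -X1 with X1 ~ N(0,1),
# so X2 ~ N(0,1) as well, and (X1+X2)/2 = 0 identically
x1 = rng.normal(loc=0.0, scale=1.0, size=1_000_000)
x2 = -x1

lhs = np.maximum(x1, 0.0).mean()             # estimate of E[X1]^+
rhs = np.maximum((x1 + x2) / 2, 0.0).mean()  # exactly 0 here

print(lhs, rhs)
assert lhs >= rhs
```

This dependent example also shows the inequality can be far from tight: the right-hand side collapses to $0$ while the left-hand side stays at $\mathbb{E}[X_1]^+ = 1/\sqrt{2\pi}$.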