Consider two random variables $X_1$ and $X_2$, independently and uniformly distributed on $[0,1]$.
I want to find the set $A\subseteq [0,1]^2$ of largest area such that the conditional means of $X_1$ and of $X_2$, given $(X_1,X_2)\in A$, are both at least some level $y\in(1/2,1)$. For $y\leq 1/2$, it is easy to see that $A=[0,1]^2$ solves the problem, which is why I exclude $y\leq 1/2$.
So the optimization problem is
$$\max_{A\subseteq[0,1]^2}\int_A dx_1\,dx_2$$ $$\text{s.t.}\quad E[X_1\mid (X_1,X_2)\in A]\geq y \quad\text{and}\quad E[X_2\mid (X_1,X_2)\in A]\geq y.$$
Is there a general solution to this problem, or any reference I can look at for similar types of problems?
Adding the two constraints gives $E[X_1 + X_2 \mid (X_1, X_2) \in A] \geq 2y$. Under only this weaker constraint, the optimal $A$ is the region $\{x_1 + x_2 \geq c\}$ for an appropriately chosen $c$: exchanging an included point with a small sum for an excluded point with a larger sum can only raise the conditional mean, so a sum-threshold set maximizes area. Specifically, $c$ is chosen so the constraint binds, i.e. $E[X_1 + X_2 \mid X_1 + X_2 \geq c] = 2y$; this conditional mean increases continuously from $1$ at $c=0$ toward $2$ as $c \to 2$, so such a $c$ exists for every $y \in (1/2,1)$. For $y \geq 2/3$ the region is the triangle with vertices $(c-1,1)$, $(1,c-1)$, $(1,1)$, and its centroid gives the closed form $c = 3y - 1$.
But since this $A$ is symmetric in $x_1$ and $x_2$, the two coordinates have the same conditional mean, so each equals $y$ and the original, stronger constraints hold with equality. Hence the same $A$ is optimal for the original problem as well.
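A quick Monte Carlo sketch (my own check, not part of the original argument) that finds the threshold $c$ by bisection on the empirical conditional mean and then verifies that each coordinate's conditional mean comes out to $y$. The sample size and test value $y = 0.75$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.random((1_000_000, 2))  # uniform samples on [0,1]^2
s = pts.sum(axis=1)

def cond_mean_sum(c):
    """Monte Carlo estimate of E[X1 + X2 | X1 + X2 >= c]."""
    return s[s >= c].mean()

def find_c(y, tol=1e-6):
    """Bisection for the threshold c with E[X1 + X2 | sum >= c] = 2y.

    The conditional mean is increasing in c, so bisection applies.
    """
    lo, hi = 0.0, 2.0 - 1e-3
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if cond_mean_sum(mid) < 2 * y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

y = 0.75
c = find_c(y)        # for y >= 2/3 the exact value is c = 3y - 1 = 1.25
mask = s >= c
print(c)                     # ≈ 1.25
print(pts[mask, 0].mean())   # ≈ 0.75, equals y by symmetry
print(pts[mask, 1].mean())   # ≈ 0.75
```

The symmetry of the region in $x_1$ and $x_2$ is what makes both coordinate means land on $y$ simultaneously, matching the argument above.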