Suppose that we have $n$ independent and identically distributed random variables $X_1, \cdots, X_n$ which we observe with zero-mean measurement error. More precisely, we observe $X^*_i = X_i + U_i$, where the errors $U_1, \cdots, U_n$ are independent and identically distributed with $E[U_i] = 0$, and $(U_1, \cdots, U_n)$ is independent of $(X_1, \cdots, X_n)$. I need to show that $$E[U_i \mid X^*_i \geq X^*_j \hspace{0.1cm}\forall j] \geq 0.$$
Intuitively, this is obvious: one reason why $X_i + U_i$ could be maximal is that $U_i$ is high, which in turn suggests that the expected value of $U_i$ is high given that $X_i + U_i$ is maximal. However, I am struggling to show this formally and would be grateful for some help.
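The intuition can at least be sanity-checked by simulation (my own illustration, not part of the problem: I take $X_i$ and $U_i$ standard normal and $n = 5$):

```python
import numpy as np

# Monte Carlo check: X_i, U_i ~ N(0, 1) i.i.d. (an illustrative choice), n = 5.
rng = np.random.default_rng(0)
reps, n = 200_000, 5
X = rng.normal(size=(reps, n))
U = rng.normal(size=(reps, n))
Xstar = X + U
idx = np.argmax(Xstar, axis=1)            # index i with X*_i maximal
u_at_max = U[np.arange(reps), idx]        # the corresponding error U_i
print(u_at_max.mean())                    # clearly positive, around 0.8
```

The sample mean comes out around $0.8$, comfortably positive: selecting on a large $X_i + U_i$ indeed selects on a large $U_i$.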
Here is my 'progress' so far (I may have over-complicated matters, if so please point me towards a better way forward):
First, I note that by Bayes' theorem, we can write the density of $U$ given that $X + U$ takes some particular value $c$:\begin{align*} f_{U \mid U+X=c}(u) &= \frac{f_{U+X \mid U}(c \mid u)\,f_U(u)}{f_{U+X}(c)} = \frac{f_{X}(c-u)\,f_U(u)}{f_{U+X}(c)}\\ &= \frac{f_X(c-u)\,f_U(u)}{\displaystyle \int_{-\infty}^{\infty}f_U(k)f_X(c-k)\,\mathrm{d}k}. \end{align*}
From this, I get the expected value of $U$ given that $X+U=c$: $$E[U \mid U+X=c]= \int_{\underline{u}}^{\bar{u}} u\,f_{U \mid U+X=c}(u)\,\mathrm{d}u = \int_{\underline{u}}^{\bar{u}}\frac{u\,f_X(c-u)\,f_U(u)}{\displaystyle\int_{-\infty}^{\infty}f_U(k)f_X(c-k)\,\mathrm{d}k}\,\mathrm{d}u,$$ where $\underline{u}$ and $\bar{u}$ bound the support of $U$ (I will assume a bounded support, but presumably the argument goes through without this assumption?)
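As a check on this conditional density, here is a special case with a known closed form (my assumption for illustration: $X, U \sim N(0,1)$, in which case $U \mid U + X = c \sim N(c/2, 1/2)$):

```python
import numpy as np
from scipy import stats, integrate

# Check the Bayes-rule density against the known closed form for
# X, U ~ N(0, 1): there, U | U + X = c is N(c/2, 1/2).
f = stats.norm(0, 1).pdf
c = 0.7
denom, _ = integrate.quad(lambda k: f(k) * f(c - k), -np.inf, np.inf)
mean, _ = integrate.quad(lambda u: u * f(c - u) * f(u) / denom, -np.inf, np.inf)
print(mean)  # ≈ 0.35 = c / 2, as the closed form predicts
```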
This tells us the expected value of $U_i$ given that $X_i + U_i = c$. However, we wanted to know the expected value of $U_i$ given that $X_i + U_i$ is maximal. Fortunately, since the $X^*_j$ are i.i.d., the distribution of $X_i + U_i$ given that it is maximal is just the distribution of the 'highest order statistic' of the $X^*_j$: $$f_{X + U \mid X + U \text{ is maximal}}(c) = n\,F_{X+U}(c)^{\,n-1}\,f_{X+U}(c), \qquad \text{where } f_{X+U}(c) = \int_{-\infty}^{\infty}f_U(k)f_X(c-k)\,\mathrm{d}k$$ and $F_{X+U}$ is the corresponding cdf.
By the law of iterated expectations, we thus obtain:\begin{align*} &\mathrel{\phantom{=}}{} E[U_i \mid X^*_i \geq X^*_j\ \forall j]\\ &= \int{E[U \mid U+X=c]}\,f_{X+U \mid X + U \text{ is maximal}}(c)\,\mathrm{d}c\\ &= n\int\Biggl(\int_{\underline{u}}^{\bar{u}} u\,f_X(c-u)\,f_U(u)\,\mathrm{d}u\Biggr) F_{X+U}(c)^{\,n-1} \,\mathrm{d}c, \end{align*} where the marginal density $f_{X+U}(c)$ in the denominator of $E[U \mid U+X=c]$ has cancelled against the matching factor in the order-statistic density $n\,F_{X+U}(c)^{n-1}f_{X+U}(c)$.
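For what it is worth, this expression can be evaluated numerically in a special case (standard normal $X$ and $U$ with $n = 3$, my assumption), writing the density of the maximum as $n\,F_{X+U}(c)^{n-1}f_{X+U}(c)$ so that the marginal density cancels, and it does come out positive:

```python
import numpy as np
from scipy import stats, integrate

# Evaluate E[U_i | X*_i maximal] for X, U ~ N(0, 1), n = 3 (illustrative).
n = 3
f = stats.norm(0, 1).pdf
fsum = stats.norm(0, np.sqrt(2))          # X + U ~ N(0, 2)

def inner(c):
    # ∫ u f_X(c - u) f_U(u) du: the numerator of E[U | U + X = c], with
    # the marginal density f_{X+U}(c) cancelled against the matching
    # factor in the order-statistic density.
    val, _ = integrate.quad(lambda u: u * f(c - u) * f(u), -np.inf, np.inf)
    return val

val, _ = integrate.quad(lambda c: n * inner(c) * fsum.cdf(c) ** (n - 1), -10, 10)
print(val)  # positive, ≈ 0.6
```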
It thus seems that I have found the expression I am after. However, looking at this rather complicated expression, I see no way to show that it is non-negative (which is the whole point of the exercise!). If you can see a way, or can think of a simpler approach than the one I have adopted, please do let me know.
Thanks again in advance!
Lemma 1: If $X$ and $Y$ are independent random variables, then for every $C \in \mathscr{B}(\mathbb{R}^2)$, $E(I_C(X, Y) \mid X) \overset{\text{a.s.}}{=} g(X)$, where $g(x) = E(I_C(x, Y))$.

Proof: Define $\mathscr{G} = \{C \in \mathscr{B}(\mathbb{R}^2) \mid \text{if } g(x) = E(I_C(x, Y)) \text{, then } E(I_C(X, Y) \mid X) \overset{\text{a.s.}}{=} g(X)\}$. For any $A, B \in \mathscr{B}(\mathbb{R})$, because $I_B(Y) = I_{\{Y \in B\}}$ is independent of $X$,$$ E(I_{A \times B}(X, Y) \mid X) = E(I_A(X) I_B(Y) \mid X) \overset{\text{a.s.}}{=} I_A(X) E(I_B(Y) \mid X) \overset{\text{a.s.}}{=} I_A(X) E(I_B(Y)). $$ For $x \in \mathbb{R}$, note that $I_A(x)$ is a constant, thus$$ E(I_{A \times B}(x, Y)) = E(I_A(x) I_B(Y)) = I_A(x) E(I_B(Y)). $$ Therefore, $A \times B \in \mathscr{G}$. The measurable rectangles $A \times B$ form a $\pi$-system with $\sigma(\mathscr{B}(\mathbb{R}) \times \mathscr{B}(\mathbb{R})) = \mathscr{B}(\mathbb{R}^2)$, and it is easy to verify that $\mathscr{G}$ is a $\lambda$-system, so Dynkin's $\pi$–$\lambda$ theorem gives $\mathscr{G} = \mathscr{B}(\mathbb{R}^2)$.
Lemma 2: If $E(X) = 0$, $g$ is increasing, and $E|X g(X)| < \infty$, then $E(X g(X)) \geqslant 0$.

Proof: Under the given conditions,\begin{align*} E(X g(X)) &= \int\limits_{\Omega} X g(X) \,\mathrm{d}P = \int\limits_{\{X > 0\}} X g(X) \,\mathrm{d}P - \int\limits_{\{X < 0\}} (-X) g(X) \,\mathrm{d}P\\ &\geqslant \int\limits_{\{X > 0\}} X g(0) \,\mathrm{d}P - \int\limits_{\{X < 0\}} (-X) g(0) \,\mathrm{d}P = g(0) E(X) = 0. \end{align*}
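Note that lemma 2 holds deterministically for empirical distributions as well (the same inequality applied term by term), which makes it easy to spot-check; here `np.tanh` is just one arbitrary increasing function:

```python
import numpy as np

# Spot-check lemma 2: for any sample with mean exactly zero and any
# increasing g, the sample analogue of E(X g(X)) is nonnegative.
rng = np.random.default_rng(1)
g = np.tanh                    # an arbitrary increasing function
for _ in range(100):
    x = rng.normal(size=1_000)
    x -= x.mean()              # enforce sample mean zero exactly
    assert np.mean(x * g(x)) >= 0
print("ok")
```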
Now back to the problem. Assume instead that $X_1, \cdots, X_n$ are independent (not necessarily identically distributed), $U_1, \cdots, U_n$ are independent (again not necessarily identically distributed) with $E(U_k) = 0$ for $1 \leqslant k \leqslant n$, and $\sigma(X_1, \cdots, X_n)$ and $\sigma(U_1, \cdots, U_n)$ are independent.
Note that$$ A_i := \{X_i + U_i = \max\limits_{1 \leqslant j \leqslant n} (X_j + U_j)\} = \{X_i + U_i \geqslant \max\limits_{j \ne i} (X_j + U_j)\}. $$ Define$$ g(u) = P(X_i + u \geqslant \max\limits_{j \ne i} (X_j + U_j)), $$ then $g$ is increasing, since the event only grows as $u$ increases. By lemma 1 (applied with $U_i$ in the role of $X$ and the remaining variables, which are jointly independent of $U_i$, in the role of $Y$; the same argument works for random vectors) and lemma 2,$$ E(U_i I_{A_i}) = E(E(U_i I_{A_i} \mid U_i)) = E(U_i E(I_{A_i} \mid U_i)) = E(U_i g(U_i)) \geqslant 0, $$ thus $E(U_i \mid A_i) = \dfrac{E(U_i I_{A_i})}{P(A_i)} \geqslant 0$.
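Finally, a simulation of this more general statement, with heterogeneous distributions of my own choosing (each $U_k$ centred to mean zero):

```python
import numpy as np

# Monte Carlo check of the general result with non-identically
# distributed components (an illustrative choice of distributions).
rng = np.random.default_rng(2)
reps, n = 200_000, 3
X = np.column_stack([
    rng.normal(1.0, 2.0, reps),          # X_1
    rng.uniform(-3.0, 3.0, reps),        # X_2
    rng.exponential(1.0, reps),          # X_3
])
U = np.column_stack([
    rng.normal(0.0, 1.0, reps),          # U_1, mean 0
    rng.uniform(-2.0, 2.0, reps),        # U_2, mean 0
    rng.exponential(1.0, reps) - 1.0,    # U_3, centred to mean 0
])
Xstar = X + U
idx = np.argmax(Xstar, axis=1)
cond_means = [U[idx == i, i].mean() for i in range(n)]
print(cond_means)                        # each conditional mean is positive
```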