Let $U:[0,1]\rightarrow[0,1]$ be a continuous function such that $[0,1]$ can be partitioned into a finite number of intervals on each of which $U(\cdot)$ is either strictly increasing, strictly decreasing, or constant. Let $\mu$ be a probability measure on $[0,1]$; it need not be continuous.
I have to show that $\mathbb{E}(x|U(x) \leq u)$ is continuous at $u$ whenever $\mu(\{U(x)=u\})=0$.
I tried distributional convergence. Since $\{U(x) \leq u\}$ consists of a finite number of intervals, I think it can be shown that $\mu|_{\{U(x) \leq u+\epsilon\}} \Rightarrow \mu|_{\{U(x) \leq u\}}$ (weak convergence) as $\epsilon \to 0$, provided $\mu(\{U(x)=u\})=0$. Then continuity of the expectation operator would conclude the argument. But I'm getting confused trying to show the distributional convergence.
Many thanks in advance for your help.
Assume $P(U(x) \leq u) > 0$, since otherwise the conditional expectation is not well defined.
We will also need continuity from above of the measure, $P\left(\bigcap\limits_{i=1}^\infty A_i\right) = \lim\limits_{n \to \infty} P\left(\bigcap\limits_{i=1}^n A_i\right)$, which gives $$\lim\limits_{\varepsilon \to 0}P\left(U^{-1}\left((u - \varepsilon, u]\right)\right) = \lim\limits_{n\to \infty}P\left(U^{-1}\left(\left(u - \tfrac{1}{n}, u\right]\right)\right) =\\ \lim\limits_{n\to\infty}P\left(\bigcap\limits_{i=1}^n U^{-1}\left(\left(u - \tfrac{1}{i}, u\right]\right)\right) =\\ P\left(\bigcap\limits_{i = 1}^\infty U^{-1}\left(\left(u - \tfrac{1}{i}, u\right]\right)\right) =\\ P\left(U^{-1}(\{u\})\right),$$ which is $0$ by assumption. Similarly, since $\bigcap\limits_{n=1}^\infty \left(u, u + \tfrac{1}{n}\right] = \varnothing$, we get $\lim\limits_{\varepsilon \to 0} P\left(U^{-1}\left((u, u + \varepsilon]\right)\right) = P\left(U^{-1}(\varnothing)\right) = 0$.
By the law of total expectation, $$\mathbb E(x | U(x) \leq u) = \mathbb E(x | U(x) \leq u - \varepsilon) \cdot \frac{P(U(x) \leq u - \varepsilon)}{P(U(x) \leq u)} + \mathbb E(x | u - \varepsilon < U(x) \leq u) \cdot \frac{P(u - \varepsilon < U(x) \leq u)}{P(U(x) \leq u)}$$
$\mathbb E(x | u - \varepsilon < U(x) \leq u)$ is bounded (it lies in $[0,1]$), $P(u - \varepsilon < U(x) \leq u) \to 0$ as $\varepsilon \to 0$ by the computation above, and hence $\frac{P(U(x) \leq u - \varepsilon)}{P(U(x) \leq u)} \to 1$ as $\varepsilon \to 0$, so we necessarily have $$\lim\limits_{\varepsilon \to 0+} \mathbb E(x | U(x) \leq u - \varepsilon) = \mathbb E(x | U(x) \leq u)$$
Similarly, $$\mathbb E(x | U(x) \leq u + \varepsilon) = \mathbb E(x | U(x) \leq u) \cdot \frac{P(U(x) \leq u)}{P(U(x) \leq u + \varepsilon)} + \mathbb E(x | u < U(x) \leq u + \varepsilon) \cdot \frac{P(u < U(x) \leq u + \varepsilon)}{P(U(x) \leq u + \varepsilon)}$$
And again, the right-hand side tends to $\mathbb E(x|U(x) \leq u)$ as $\varepsilon \to 0$, so the left-hand side does too. Together with the left limit above, this gives continuity at $u$.
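Not part of the proof, but as a quick sanity check, here is a small Monte Carlo sketch (with my own choices of $U$ and $\mu$, not from the question) showing the conditional mean varying continuously at a $u$ with $\mu(\{U(x) = u\}) = 0$ and jumping at a $u$ whose level set carries an atom:

```python
import numpy as np

# Sanity-check example (my own choices, not from the question):
#   U(x) = |2x - 1|  -- strictly decreasing on [0, 1/2], increasing on [1/2, 1]
#   mu   = 0.8 * Uniform[0,1] + 0.2 * delta_{1/4}
# Since U(1/4) = 1/2, we have mu({U = 1/2}) = 0.2 > 0, while mu({U = 0.3}) = 0
# (that level set is just the two points x = 0.35 and x = 0.65).

rng = np.random.default_rng(0)
N = 400_000
# Sample from mu: with probability 0.2 take the atom at 1/4, else Uniform[0,1].
x = np.where(rng.random(N) < 0.2, 0.25, rng.random(N))
U = np.abs(2 * x - 1)

def cond_mean(u):
    """Monte Carlo estimate of E(x | U(x) <= u)."""
    mask = U <= u
    return x[mask].mean()

# No atom on {U = 0.3}: the conditional mean barely moves across u = 0.3.
gap_smooth = abs(cond_mean(0.3 + 1e-3) - cond_mean(0.3 - 1e-3))

# mu({U = 1/2}) = 0.2: the conditional mean jumps at u = 1/2
# (left limit is 1/2; value at u = 1/2 is (0.8*0.25 + 0.2*0.25)/0.6 = 0.25/0.6).
gap_atom = abs(cond_mean(0.5) - cond_mean(0.5 - 1e-3))

print(f"gap at u=0.3: {gap_smooth:.4f}")  # small, shrinks as N grows
print(f"gap at u=0.5: {gap_atom:.4f}")    # close to |0.25/0.6 - 0.5| ~ 0.083
```

This matches the result: the discontinuity appears exactly where $\mu(\{U(x)=u\}) > 0$.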