Let $X$ denote a random variable that is smoothly distributed on $[0, 1]$ with PDF $f$. Consider $$g(c) = \mathbb{P}(X< c)\, \phi(\mathbb{E}[X \mid X < c]) + \int_c^1 \phi(x)f(x)\, dx$$ where $\phi$ is some increasing and concave function. If we set $c = 0$, this reduces to $$g(0) = \int_0^1 \phi(x)f(x)\, dx = \mathbb{E}[\phi(X)]$$ Meanwhile, if we set $c = 1$, it reduces to $$g(1) = \phi(\mathbb{E}[X \mid X < 1]) = \phi(\mathbb{E}[X])$$ We can compare these using Jensen's inequality: since $\phi$ is concave (strictly, say), $g(1) = \phi(\mathbb{E}[X]) > \mathbb{E}[\phi(X)] = g(0)$. This suggests that $g(c)$ should be strictly increasing in $c$ over the full interval $[0, 1]$, not just larger at the right endpoint. Intuitively, increasing $c$ shifts probability mass from the integral term into the first term, and by Jensen the first term values that mass more highly, since it applies $\phi$ to an average rather than averaging $\phi$. Is this indeed the case?
N.B. My attempt was as follows. Differentiating the first term of $g(c)$ with respect to $c$ gives, by the product rule, $$\mathbb{P}(X < c)\, \phi'(\mathbb{E}[X \mid X < c])\frac{\partial \mathbb{E}[X \mid X < c]}{\partial c} + \phi(\mathbb{E}[X \mid X < c])f(c).$$ Using the identity $\frac{\partial \mathbb{E}[X \mid X < c]}{\partial c} = \frac{f(c)\left(c -\mathbb{E}[X \mid X < c]\right)}{\mathbb{P}(X < c)}$, this becomes \begin{equation} \begin{split} &\mathbb{P}(X < c)\, \phi'(\mathbb{E}[X \mid X < c])\frac{f(c)\left(c -\mathbb{E}[X \mid X < c]\right)}{\mathbb{P}(X < c)} + \phi(\mathbb{E}[X \mid X < c])f(c) \\ = {}& \phi'(\mathbb{E}[X \mid X < c])f(c)\left(c -\mathbb{E}[X \mid X < c]\right) + \phi(\mathbb{E}[X \mid X < c])f(c). \end{split} \end{equation} Meanwhile, differentiating the second term of $g(c)$ gives $-\phi(c)f(c)$ by the Leibniz rule. So the derivative is $$ g'(c) = \phi'(\mathbb{E}[X \mid X < c])f(c)\left(c -\mathbb{E}[X \mid X < c]\right) + \phi(\mathbb{E}[X \mid X < c])f(c) - \phi(c)f(c),$$ which (dividing by $f(c) > 0$) has the same sign as $$ \phi'(\mathbb{E}[X \mid X < c])\left(c -\mathbb{E}[X \mid X < c]\right) + \phi(\mathbb{E}[X \mid X < c]) - \phi(c). $$ Clearly $c > \mathbb{E}[X \mid X < c]$, so the first term is positive; but $\phi(\mathbb{E}[X \mid X < c]) - \phi(c) < 0$, and I cannot get any further.
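As a sanity check on the derivative formula above, here is a finite-difference comparison using a concrete example of my own choosing (not from the question): $X \sim \mathrm{Uniform}(0,1)$, so $f(c) = 1$, $\mathbb{P}(X < c) = c$, $\mathbb{E}[X \mid X < c] = c/2$, with $\phi(x) = \sqrt{x}$.

```python
import math

# Example assumptions (mine, not from the question):
# X ~ Uniform(0, 1): f(c) = 1, P(X < c) = c, E[X | X < c] = c / 2.
# phi(x) = sqrt(x), which is increasing and strictly concave.

def phi(x):
    return math.sqrt(x)

def dphi(x):
    return 0.5 / math.sqrt(x)

def g(c):
    # g(c) = P(X < c) * phi(E[X | X < c]) + integral of sqrt(x) from c to 1
    return c * phi(c / 2) + (2 / 3) * (1 - c ** 1.5)

def g_prime(c):
    # The derived formula: phi'(E)(c - E) f(c) + phi(E) f(c) - phi(c) f(c),
    # with E = c / 2 and f = 1.
    m = c / 2
    return dphi(m) * (c - m) + phi(m) - phi(c)

h = 1e-6
for c in (0.2, 0.5, 0.8):
    fd = (g(c + h) - g(c - h)) / (2 * h)
    assert abs(fd - g_prime(c)) < 1e-6  # closed form matches finite difference
    assert g_prime(c) > 0               # consistent with g being increasing
```

The closed-form $g'(c)$ agrees with the central finite difference at each test point, and is positive, consistent with the conjecture.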
Because $\phi$ is concave, it lies below its tangent lines (linear approximations overestimate the function): $$\phi(b) \le \phi(a) + \phi'(a) (b-a).$$ Now plug in $b=c$ and $a=\mathbb{E}[X \mid X < c]$. Rearranging gives $$\phi'(\mathbb{E}[X \mid X < c])\left(c - \mathbb{E}[X \mid X < c]\right) + \phi(\mathbb{E}[X \mid X < c]) - \phi(c) \ge 0,$$ which is exactly the sign expression you derived. Hence $g'(c) \ge 0$, so $g$ is increasing; the inequality, and thus the monotonicity, is strict when $\phi$ is strictly concave, since $c > \mathbb{E}[X \mid X < c]$.
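The conclusion can be checked numerically. In the example below (my choice, not from the question), $X \sim \mathrm{Uniform}(0,1)$ and $\phi(x) = \sqrt{x}$, for which everything has a closed form: $\mathbb{P}(X < c) = c$, $\mathbb{E}[X \mid X < c] = c/2$, and $\int_c^1 \sqrt{x}\, dx = \tfrac{2}{3}(1 - c^{3/2})$.

```python
import math

# Example assumptions (mine): X ~ Uniform(0, 1), phi = sqrt (increasing,
# strictly concave). Then
#   g(c) = c * sqrt(c / 2) + (2 / 3) * (1 - c ** 1.5)

def g(c):
    return c * math.sqrt(c / 2) + (2 / 3) * (1 - c ** 1.5)

vals = [g(k / 200) for k in range(201)]   # g on a grid over [0, 1]

assert math.isclose(vals[0], 2 / 3)                # g(0) = E[phi(X)]
assert math.isclose(vals[-1], math.sqrt(0.5))      # g(1) = phi(E[X])
assert all(b > a for a, b in zip(vals, vals[1:]))  # g strictly increasing
```

The endpoints match the Jensen comparison from the question ($g(1) = 1/\sqrt{2} > 2/3 = g(0)$), and $g$ is strictly increasing across the grid, as the tangent-line argument predicts.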