Suppose $u(\cdot)$ and $v(\cdot)$ are differentiable, strictly increasing, and strictly concave real functions. Specifically, $v(\cdot)$ is "more concave" than $u(\cdot)$ in the sense that there exists an increasing, strictly concave function $\phi(\cdot)$ such that $v(x)=\phi(u(x))$ for all $x$. Equivalently (assuming twice differentiability), \begin{equation} \frac{v''(x)}{v'(x)}<\frac{u''(x)}{u'(x)} \textrm{ for all }x\,. \end{equation}
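As a quick sanity check on this ordering, here is a small numeric sketch. The pair $u, v$ below is my own illustration (not part of the question): $v(x)=-e^{-2x}=-u(x)^2$, i.e. $v=\phi(u)$ with $\phi(t)=-t^2$, which is increasing and strictly concave on $t<0$; `ratio` approximates $f''/f'$ by central finite differences.

```python
import math

def ratio(f, x, h=1e-5):
    """Approximate f''(x)/f'(x) via central finite differences."""
    d1 = (f(x + h) - f(x - h)) / (2 * h)
    d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2
    return d2 / d1

# Illustrative pair: u(x) = -e^{-x} has u''/u' = -1 everywhere,
# v(x) = -e^{-2x} has v''/v' = -2 everywhere, so v''/v' < u''/u'.
u = lambda x: -math.exp(-x)
v = lambda x: -math.exp(-2 * x)

for x in (-1.0, 0.0, 2.0):
    print(x, ratio(u, x), ratio(v, x))
```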
Let $f(\cdot)$ and $g(\cdot)$ be two fixed (well-behaved) real functions on a fixed probability space $(\Sigma, H, \Pr)$, where the $\sigma$-algebra satisfies the usual conditions. Assume \begin{equation} \int h(\cdot)f(\cdot) < \int h(\cdot)g(\cdot)\,, \end{equation} and \begin{equation} \int h(\cdot)u(f(\cdot)) = \int h(\cdot)u(g(\cdot))\,, \end{equation} where $h(\cdot)$ is the density function on the probability space.
Conjecture: \begin{equation} \int h(\cdot)v(f(\cdot)) > \int h(\cdot)v(g(\cdot))\,. \end{equation}
Update: thanks to @zhoraster, who provided a very clever potential answer; see below. That answer changes my initial question a little by considering two different probability distributions on the two sides of the inequalities (in the counterexample he gave). I have accordingly updated the wording of my initial conjecture to make it mathematically more precise: the probability distribution is fixed on both sides.
Also, I'm not sure if the following (weaker) conjecture is true or not: Using Jensen's inequality to prove another inequality?
Take any two probability distributions $f$ and $g$ such that $$ E[f] = E[g]\tag{1} $$ and $$ E[u(f)] = E[u(g)]\tag{2} $$
Then your conjecture would imply$^*$ that $$ E[v(g)] = E[v(f)]\tag{3} $$ for any $v$ that is "more concave" than $u$; consequently, $f$ and $g$ would be identically distributed. So it would follow from your conjecture that (1) and (2) imply equality in distribution. Obviously, this can't be true.
Similarly, your other conjecture is false as well; one just needs to adjust the counterexample a little to make the probabilities equal.
Here is a particular example:
The distribution of $f$: $6$ with probability $0.01$, $-6$ with probability $0.01e^{-6}$, $0$ with probability $1-0.01-0.01 e^{-6}$
The distribution of $g$: $1$ with probability $0.1$, $-1$ with probability $0.1e^{-1}$, $0$ with probability $1-0.1-0.1 e^{-1}$
Then $$ E[f] = 0.06-0.06e^{-6}< 0.1-0.1e^{-1} = E[g],\\ E[-e^{-f}] = -0.01e^{-6} - (1-0.01-0.01 e^{-6}) - 0.01e^{-6}\cdot e^6 = -1,\\ E[-e^{-g}] = -0.1e^{-1} - (1-0.1-0.1 e^{-1}) - 0.1e^{-1}\cdot e = - 1, $$ but $$ E[-e^{-2f}] = -0.01e^{-12} -(1-0.01-0.01 e^{-6}) - 0.01e^{-6}\cdot e^{12} \\ < E[-e^{-2g}]= -0.1e^{-2} -(1-0.1-0.1 e^{-1}) - 0.1e^{-1}\cdot e^{2}. $$
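The arithmetic above can be checked numerically; here is a minimal sketch using the three-point distributions and the utilities $u(x)=-e^{-x}$, $v(x)=-e^{-2x}$ from this counterexample (the helper `E` and the variable names are mine):

```python
import math

# Three-point distributions from the counterexample: (value, probability).
f_dist = [(6, 0.01), (-6, 0.01 * math.exp(-6)),
          (0, 1 - 0.01 - 0.01 * math.exp(-6))]
g_dist = [(1, 0.1), (-1, 0.1 * math.exp(-1)),
          (0, 1 - 0.1 - 0.1 * math.exp(-1))]

def E(dist, func):
    """Expectation of func under a finite distribution."""
    return sum(p * func(x) for x, p in dist)

u = lambda x: -math.exp(-x)      # u(x) = -e^{-x}
v = lambda x: -math.exp(-2 * x)  # v(x) = -e^{-2x}, "more concave" than u

print(E(f_dist, lambda x: x), E(g_dist, lambda x: x))  # E[f] < E[g]
print(E(f_dist, u), E(g_dist, u))                      # both -1
print(E(f_dist, v), E(g_dist, v))                      # E[v(f)] < E[v(g)]
```

So $E[f]<E[g]$ and $E[u(f)]=E[u(g)]$, yet $E[v(f)]<E[v(g)]$ rather than $>$, contradicting the conjecture.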
One can adjust this example slightly by splitting the probabilities into many small, equal pieces, which yields a counterexample to your other conjecture as well.
$^*$ Perturb $f$ a little so that (2) still holds but the sign in (1) changes to $<$. If your conjecture were true, the sign in (3) would change to $>$. Now perturb $f$ "in the opposite direction", so that (2) still holds and the sign in (1) changes to $>$; then the sign in (3) would change to $<$. By continuity, (3) must therefore hold with equality.