I have two unidimensional functions, $x^a$ and $x^b$, with $a>0$ and $b>0$, the sole difference being that one is more concave than the other ($a \neq b$). Is there an inequality/theorem relating the averages of these two functions, stating that one average is greater than the other?
My actual problem is illustrated in the following graph:
Since $1>\alpha>0$, $f_{L}$ is convex, and $f_{Y}$ is less convex. The latter can even be linear ($\alpha=0.5$) or concave ($\alpha<0.5$).
I am looking for an inequality/theorem proving that the averages of the same two points projected onto the two functions (the $y$-axis levels where the points $p-\Delta$ and $p+\Delta$ cross the red lines) differ systematically; in particular, that $\bar{f_{L}}>\bar{f_{Y}}$, as the graph shows.
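For concreteness, here is the kind of comparison I mean, with purely illustrative exponents: take $f_{L}(x)=x^2$ and $f_{Y}(x)=x$ and the two points $0.5$ and $1.5$. Then

$$ \bar{f_{L}} = \tfrac{1}{2}\left(0.5^2+1.5^2\right) = 1.25 \qquad \text{and} \qquad \bar{f_{Y}} = \tfrac{1}{2}\left(0.5+1.5\right) = 1, $$

so $\bar{f_{L}} \gt \bar{f_{Y}}$ in this instance; I am after a general result of this kind.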
I have searched for such an inequality, but I cannot find one.

One inequality that can be derived is $\bar{f_{L}}^\alpha \ge \bar{f_{Y}}\,$, which follows from the generalized mean inequality. Better bounds will likely require additional information on the values and ranges.
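For completeness, here is a sketch of how the generalized (power) mean inequality gives this, assuming (as in the figure and the edit below) that the two exponents are $a+1=\frac{1}{1-\alpha}$ and $a=\frac{\alpha}{1-\alpha}$, and that both functions are averaged over the same two positive points $x_1$ and $x_2$:

$$ \bar{f_{Y}} = \tfrac{1}{2}\left(x_1^{a}+x_2^{a}\right) = M_a^{\,a} \;\le\; M_{a+1}^{\,a} = \left(\tfrac{1}{2}\left(x_1^{a+1}+x_2^{a+1}\right)\right)^{\frac{a}{a+1}} = \bar{f_{L}}^{\,\alpha}, $$

where $M_r = \left(\tfrac{1}{2}\left(x_1^{r}+x_2^{r}\right)\right)^{1/r}$ denotes the power mean, $M_a \le M_{a+1}$ is the generalized mean inequality, and $\frac{a}{a+1}=\alpha$.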
[ EDIT ] For the particular case shown in the figure, the stronger inequality $\bar{f_{L}} \gt \bar{f_{Y}}\,$ holds true. Let $q=\frac{\Delta}{p} \in (0,1)$ and $a=\frac{\alpha}{1-\alpha} \gt 0$ so that $\frac{1}{1-\alpha}=a+1$. Then:
$$ \require{cancel} \begin{align} 2\,(\bar{f_{L}} - \bar{f_{Y}}) & = \left((1-q)^{a+1}+(1+q)^{a+1}-(1-q)^{a}-(1+q)^{a}\right) \\ & = \left((1-q)^a(\bcancel{1}-q-\bcancel{1}) + (1+q)^a(\bcancel{1}+q-\bcancel{1})\right) \\ & = q \left((1+q)^a-(1-q)^a\right) \\ & \gt 0 \end{align} $$
The last inequality follows from the monotonicity of $x \mapsto x^a$ in $x$ for $x,a \gt 0$, since $1+q \gt 1-q \gt 0$. The above does not use convexity arguments, so it holds for $\alpha \le 0.5$ as well.
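As a quick numerical sanity check, take (arbitrarily) $\alpha=\tfrac{1}{4}$, so $a=\tfrac{1}{3}$ and $f_{Y}$ is concave, and $q=\tfrac{1}{2}$, i.e. evaluate at the normalized points $1-q=0.5$ and $1+q=1.5$:

$$ \bar{f_{Y}} = \tfrac{1}{2}\left(0.5^{1/3}+1.5^{1/3}\right) \approx \tfrac{1}{2}(0.794+1.145) \approx 0.969, \qquad \bar{f_{L}} = \tfrac{1}{2}\left(0.5^{4/3}+1.5^{4/3}\right) \approx \tfrac{1}{2}(0.397+1.717) \approx 1.057, $$

so $\bar{f_{L}} \gt \bar{f_{Y}}$ as claimed, and also $\bar{f_{L}}^{\,\alpha} \approx 1.057^{1/4} \approx 1.014 \ge \bar{f_{Y}}$, consistent with the weaker bound above.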