Suppose I have two different discrete random variables $y>0$ and $x>0$. I want to compare two expected values involving a nonlinear transformation of these variables: when is one larger than the other, i.e., when is $$E\left[y^\alpha\right]>E\left[x^\alpha\right]?$$ We can assume $0<\alpha<1$ (making the transformation concave). What conditions on the distributions of these random variables are necessary or sufficient for this inequality to hold?
Two cases that have a simple answer:
For $\alpha=1$ there is no nonlinearity, so the matter is quite easy and simply boils down to which random variable has a larger mean.
For another nonlinear transformation of the random variables, $E[y-y^2]$ and $E[x-x^2]$, the question reduces to an expression in the mean and variance of the random variables, since $E[y-y^2]=E[y]-(\operatorname{Var}(y)+E[y]^2)$.
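This identity is just $E[y^2]=\operatorname{Var}(y)+E[y]^2$ rearranged; a quick numeric sketch (with an arbitrarily chosen discrete distribution, picked only for illustration) confirms it:

```python
# Check E[y - y^2] = E[y] - (Var(y) + E[y]^2) on a small discrete distribution.
# The support and probabilities below are arbitrary illustration values.
values = [0.1, 0.3, 0.6]
probs = [0.2, 0.5, 0.3]

mean = sum(p * v for p, v in zip(probs, values))
second_moment = sum(p * v**2 for p, v in zip(probs, values))
var = second_moment - mean**2

lhs = sum(p * (v - v**2) for p, v in zip(probs, values))  # E[y - y^2] directly
rhs = mean - (var + mean**2)                               # via mean and variance

assert abs(lhs - rhs) < 1e-12
```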
Since the above transformation $E\left[y^\alpha\right]$, $0<\alpha<1$ is concave, there should be some condition involving some measure of mean and variance (and possibly higher order moments) as well. However, within the class of concave transformations, the quadratic case seems to be unique in that it gives such a simple condition on the distributions (only involving the mean and variance of the random variables). Or is there a similarly simple condition?
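One way to see that the quadratic case really is special: for $0<\alpha<1$, two distributions can share the same mean and variance and still differ in $E[x^\alpha]$, so no condition on the first two moments alone can decide the inequality. A small sketch (distributions chosen by hand for this illustration):

```python
alpha = 0.5

# X1 takes 1 or 4 with probability 1/2 each.
mean1 = 0.5 * 1 + 0.5 * 4                # 2.5
var1 = 0.5 * 1 + 0.5 * 16 - mean1**2     # 2.25
E1 = 0.5 * 1**alpha + 0.5 * 4**alpha     # E[X1^alpha] = 1.5

# X2 takes 2 with probability 0.9 and 7 with probability 0.1:
# same mean and variance as X1.
mean2 = 0.9 * 2 + 0.1 * 7                # 2.5
var2 = 0.9 * 4 + 0.1 * 49 - mean2**2     # 2.25
E2 = 0.9 * 2**alpha + 0.1 * 7**alpha     # E[X2^alpha] ~ 1.537

assert abs(mean1 - mean2) < 1e-9 and abs(var1 - var2) < 1e-9
assert E2 > E1   # E[x^alpha] differs despite identical mean and variance
```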
(I am adding the tag "economics" since the problem is similar to an expected utility problem comparing two gambles.)
$\newcommand{\E}{\mathbb{E}}$$\newcommand{\P}{\mathbb{P}}$For positive values, $\psi(x)=x^{\alpha}$ is monotone increasing for all $\alpha > 0$, and in particular for $0 < \alpha < 1$.
Therefore, for such $\psi$, we have
$$X \le Y \text{ a.s.} \implies \psi(X) \le \psi(Y) \text{ a.s.} \implies \E[\psi(X)] \le \E[\psi(Y)]$$
by monotonicity of expectation and monotonicity of $\psi$.
If we want to find a counterexample, we need to find either:
- $X$ and $Y$ which are not comparable in the partial order of random variables, such that $\E[X] > \E[Y]$; or
- $X$ and $Y$ such that $X \le Y$ and $\E[X]>\E[Y]$ (which is impossible by the monotonicity of expectation).
Let $X$ be an R.V. such that $\P(X=1)=\frac{1}{2}$ and $\P(X=2)=\frac{1}{2}$. Then $\E[X]=\frac{3}{2}$.
Let $Y$ be an R.V. such that $\P(Y=\frac{5}{4})=1$. Then $\E[Y]=\frac{5}{4}$.
Let $Z$ be an R.V. such that $\P\left(Z=\frac{149}{100}\right)=1$. Then $\E[Z]=\frac{149}{100}=1.49$.
In the partial order of random variables, $X$ and $Y$ are incomparable, but nevertheless $\E[X] > \E[Y]$. Similarly, $X$ and $Z$ are incomparable, but $\E[X] > \E[Z]$.
Let $\alpha = \frac{1}{2}$.
Then $\E[X^{\alpha}]= 0.5 \cdot 1 + 0.5 \cdot \sqrt{2} \approx 1.20710678119$.
$\E[Y^{\alpha}]=\sqrt{1.25} \approx 1.11803398875$.
Finally, $\E[Z^{\alpha}]=\sqrt{1.49} \approx 1.22065556157$.
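These numbers are easy to double-check; a short script using the same $X$, $Y$, $Z$, and $\alpha=\tfrac12$ as above:

```python
alpha = 0.5

# X: 1 or 2 with probability 1/2 each; Y and Z are point masses.
E_X = 0.5 * 1 + 0.5 * 2      # 1.5
E_Y = 1.25
E_Z = 1.49

E_X_alpha = 0.5 * 1**alpha + 0.5 * 2**alpha  # ~1.2071
E_Y_alpha = 1.25**alpha                      # ~1.1180
E_Z_alpha = 1.49**alpha                      # ~1.2207

# E[X] exceeds both E[Y] and E[Z] ...
assert E_X > E_Y and E_X > E_Z
# ... yet E[X^alpha] > E[Y^alpha] while E[X^alpha] < E[Z^alpha].
assert E_X_alpha > E_Y_alpha
assert E_X_alpha < E_Z_alpha
```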
Thus, in summary, for the same $0<\alpha<1$, we found
$$\E[X] > \E[Y] \quad\text{and}\quad \E[X^{\alpha}] > \E[Y^{\alpha}],$$
as well as
$$\E[X] > \E[Z] \quad\text{and}\quad \E[X^{\alpha}] < \E[Z^{\alpha}].$$
So we can't say much in general unless $X \ge Y$ almost surely.
The concave version of Jensen's inequality doesn't help much more either, because even if you know that $\E[X] \ge \E[Y]$, all you can conclude is that
$$\E[\psi(X)] \le \psi(\E[X]) \quad\text{and}\quad \E[\psi(Y)] \le \psi(\E[Y]) \le \psi(\E[X]),$$
using the concave form of Jensen's inequality and the monotonicity of $\E$ and $\psi$. But this doesn't tell you at all how $\E[\psi(Y)]$ and $\E[\psi(X)]$ compare to each other, unfortunately, which is the information you are interested in.
In fact, the $X$, $Y$, and $Z$ defined above show that, even when the Jensen bounds hold, both $\E[\psi(X)]>\E[\psi(Y)]$ and $\E[\psi(X)]< \E[\psi(Z)]$ are possible, despite $\E[X]>\E[Y]$ and $\E[X] > \E[Z]$ both being true (together with the corresponding inequalities that follow from Jensen's inequality and the monotonicity of $\E$ and $\psi$).
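To make this concrete, a sketch with the same $X$ and $Z$ as above: both $\E[\psi(X)]$ and $\E[\psi(Z)]$ sit below the common Jensen bound $\psi(\E[X])$, so the bound alone cannot order them.

```python
alpha = 0.5

def psi(t):
    # The concave transformation psi(t) = t^alpha for 0 < alpha < 1.
    return t ** alpha

# Same X and Z as above: X is 1 or 2 with probability 1/2 each; Z == 1.49.
E_X, E_Z = 1.5, 1.49
E_psi_X = 0.5 * psi(1) + 0.5 * psi(2)  # ~1.2071
E_psi_Z = psi(1.49)                    # ~1.2207 (Z is a point mass)

# Jensen's inequality (concave form) bounds both by psi(E[X]) ~ 1.2247 ...
assert E_psi_X <= psi(E_X)
assert E_psi_Z <= psi(E_Z) <= psi(E_X)
# ... yet their order is opposite to that of E[X] > E[Z].
assert E_psi_Z > E_psi_X
```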