Let $Z\sim N(0, 1)$. Is $E(f(Z))\le E(f(X))$ for every convex $f$ and every $X$ with mean $0$ and unit variance?


I thought about this question purely out of curiosity. I have a feeling that it is true, but I can't prove it.

The standard normal distribution arises if we maximize entropy among all absolutely continuous probability measures with mean $0$ and variance $1$. But here I want to show that $Z\preceq_{cx} X$ (convex order). That is, suppose $f$ is a convex function, $Z\sim N(0, 1)$, and $X$ is some (absolutely continuous) random variable with mean $0$ and variance $1$. I want to show that $$E(f(Z))\le E(f(X)),$$ provided both sides are finite.
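For instance, with the convex function $f(x)=x^4$ one has $E(f(Z))=E(Z^4)=3$, so the conjecture would force every such $X$ to have fourth moment at least $3$. A quick Monte Carlo sketch of that benchmark (variable names are my own):

```python
import numpy as np

# Benchmark for the convex function f(x) = x^4: for Z ~ N(0,1), E[Z^4] = 3.
# Any zero-mean, unit-variance X with a smaller fourth moment would
# refute the conjectured convex-order inequality.
rng = np.random.default_rng(0)
z = rng.standard_normal(10**7)
est = np.mean(z**4)
print(est)  # close to the exact value 3
```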

Best answer:

Take the random variables $X_k$ with densities $$ f_k(x) = \mathbb{1}_{(\alpha,+\infty)}(|x|)\,c\,|x|^{-k} $$ where $$ k>3,\qquad \alpha^2 = \frac{k-3}{k-1}, \qquad 2c = \frac{k-3}{\alpha^{3-k}}. $$ They have mean $0$ and variance $1$ for every $k$, and for $k>5$ their fourth moment is finite and equals $$ \frac{(k-3)^2}{(k-3)^2-4}. $$ This can be made as large as you want by taking $k=5+\epsilon$, but it also tends to $1$ as $k\to\infty$; in particular, for $k>3+\sqrt{6}$ it is strictly less than $3=E(Z^4)$. Taking the convex function $f(x)=x^4$ then gives $E(f(X_k))<E(f(Z))$, so the claimed inequality fails.
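As a sanity check, the normalization, variance, and fourth moment of $X_k$ can be verified numerically. A sketch using `scipy.integrate.quad` (the helper function names are my own):

```python
import numpy as np
from scipy import integrate

def density_params(k):
    # alpha and c chosen so that f_k integrates to 1 and has variance 1,
    # matching alpha^2 = (k-3)/(k-1) and 2c = (k-3) * alpha^(k-3)
    alpha = np.sqrt((k - 3) / (k - 1))
    c = 0.5 * (k - 3) * alpha**(k - 3)
    return alpha, c

def moment(k, p):
    # p-th absolute moment of X_k: 2c * integral_alpha^inf x^(p-k) dx
    # (by symmetry, odd moments of X_k itself vanish)
    alpha, c = density_params(k)
    val, _ = integrate.quad(lambda x: 2 * c * x**(p - k), alpha, np.inf)
    return val

k = 10.0
print(moment(k, 0))  # total mass, should be 1
print(moment(k, 2))  # variance, should be 1
print(moment(k, 4))  # fourth moment, (k-3)^2 / ((k-3)^2 - 4) = 49/45 < 3
```

With $k=10$ the fourth moment is $49/45\approx 1.09$, well below $E(Z^4)=3$, confirming the counterexample.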