A Conjecture on the Moment Generating Function of Functions of Sub-Gaussian Random Variables


By way of introduction, a standard normal random variable $X$ has MGF $E[\exp(tX)]=\exp(t^2/2)$. A symmetric sub-Gaussian random variable $Y$ with unit variance proxy is one such that $P(Y< -t) = P(Y>t)$ and $P(|Y|>t)\le 2 \exp(-t^2/2)$ or, up to constants in the variance proxy, $E[\exp(tY)]\le\exp(t^2/2)$. Now let $f:\mathbb R\rightarrow \mathbb R$ be an even function, so that $f(X)$ and $f(Y)$ depend only on $|X|$ and $|Y|$. The conjecture is that

$$E[e^{tf(Y)}]\le E[e^{tf(X)}]$$

It is true in the very special case where $f$ is the identity, by the MGF bound above (even though the identity is odd rather than even).
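The identity case can be checked numerically for a concrete $1$-sub-Gaussian choice of $Y$: for Rademacher $Y$ (uniform on $\{-1,+1\}$), $E[\exp(tY)]=\cosh t$, and the classical bound $\cosh t\le e^{t^2/2}$ gives exactly the conjectured inequality. A minimal sketch (the grid of $t$ values is my own choice):

```python
import math

# Identity case with Rademacher Y: E[exp(tY)] = cosh(t),
# while E[exp(tX)] = exp(t^2 / 2) for X ~ N(0, 1).
ts = [0.1, 0.5, 1.0, 2.0, 5.0]
mgf_y = [math.cosh(t) for t in ts]          # E[exp(tY)], Y uniform on {-1, +1}
mgf_x = [math.exp(t * t / 2) for t in ts]   # E[exp(tX)], X ~ N(0, 1)
print(all(a <= b for a, b in zip(mgf_y, mgf_x)))  # True
```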

N.B.: I edited the question to require symmetry of $Y$ and evenness of $f$.


2 Answers

Accepted answer (7 votes)

Symmetry does not seem to be the right concept here.

For example, sample $Y$ uniformly from $\{-1, +1\}$ -- the typical example of an $O(1)$-sub-Gaussian random variable -- and define $f(x)=1$ if $x=1$, $f(x)=-1$ if $x=-1$, and $f(x)=0$ elsewhere. Then $\mathop E e^{tf(Y)}=(e^t+e^{-t})/2=\cosh t$, but $\mathop E e^{tf(X)}=1$ since $f(X)=0$ almost surely. Your inequality is violated for every $t\neq 0$, since $\cosh t>1$. (This $f$ is odd rather than even, yet $f(Y)=Y$ is still symmetrically distributed, so symmetry of the output alone cannot rescue the inequality.)

Of course, the issue is not that $f$ is discontinuous; one could approximate it by a continuous function and obtain the same result.
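A quick numerical check of this counterexample (a sketch; the choice $t=3$, the seed, and the Monte Carlo sample size are my own):

```python
import math
import random

random.seed(0)

def f(x):
    # f(1) = 1, f(-1) = -1, and 0 elsewhere, as in the answer
    if x == 1:
        return 1.0
    if x == -1:
        return -1.0
    return 0.0

t = 3.0
lhs = math.cosh(t)  # E[exp(t f(Y))] for Y uniform on {-1, +1}, computed exactly
# Monte Carlo for E[exp(t f(X))], X ~ N(0, 1); f(X) = 0 almost surely, so this is 1
n = 100_000
rhs = sum(math.exp(t * f(random.gauss(0.0, 1.0))) for _ in range(n)) / n
print(lhs > rhs)  # True: the conjectured inequality fails at t = 3
```

Since a continuous Gaussian sample never hits $\pm 1$ exactly, the Monte Carlo average is exactly $1$, while $\cosh 3\approx 10$.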

Answer (1 vote)

If I understand your question correctly, you are asking the following. Knowing that $Y$ has tail probabilities lighter than those of $X$ (meaning that $Y$ is $1$-sub-Gaussian), is it true that for any function $f$, the random variable $f(Y)$ has tails lighter than those of $f(X)$? This cannot be true, since we can easily construct $f$ to make the tails of $f(Y)$ heavier.

Formally, take $Y=2$ (for instance) and $f(x)=1/(1+x)$. $Y$ is bounded, hence sub-Gaussian for some variance proxy (its centered version is identically $0$), and $$\mathbf{P}\left[\frac{1}{1+Y}\geq x\right]=\mathbf{P}[1/3\geq x]=\mathbf{1}\{x\leq 1/3\}.$$ However, for $x>0$ the event $\frac{1}{1+X}\geq x$ is $\{-1<X\leq 1/x-1\}$, so $$\mathbf{P}\left[\frac{1}{1+X}\geq x\right]=\Phi\left(\frac{1}{x}-1\right)-\Phi(-1)<1.$$ In particular, for all $0<x\leq 1/3$, $$\mathbf{P}[f(Y)\geq x] = 1 > \mathbf{P}[f(X)\geq x].$$ You can also compute the MGFs of $X$ and $Y$ explicitly ($e^{t^2/2}$ and $e^{2t}$) and check that $M_X(t)<M_Y(t)$ when $t$ is small and positive.
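The tail comparison can be verified numerically; note that for $x>0$ the event $1/(1+X)\geq x$ requires $1+X>0$, which is where the $\Phi(-1)$ correction comes from. A sketch with helper names of my own:

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Y = 2 and f(x) = 1/(1+x), so f(Y) = 1/3 with probability 1
x = 1.0 / 3.0
tail_fY = 1.0  # P[f(Y) >= 1/3] = 1
# For x > 0: 1/(1+X) >= x  iff  -1 < X <= 1/x - 1
tail_fX = Phi(1.0 / x - 1.0) - Phi(-1.0)
print(tail_fY > tail_fX)  # True: f(Y) has the heavier tail at x = 1/3
```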

Something that is true in some cases is that $f(Y)$ is $\sigma^2$-subgaussian for some variance proxy. There exist some very nice results, especially when $Y$ is high-dimensional. See for example this page for the special case of the sum, or this one for a result allowing for general $f:\mathbf{R}^n\rightarrow \mathbf{R}$, although this is a vast theory that I don't know very well.