Functions that behave "nicely" under summation in expectation


Let $(X_t)_{t=1}^\infty$ be a sequence of i.i.d. random variables and $(Y_t)_{t=1}^\infty$ be another sequence of i.i.d. random variables, independent of the first.

For what functions $f$ is the following theorem true?

If for every $t$:

$$ E[f(Y_t)] > E[f(X_t)] $$

Then for a sufficiently large $T$:

$$ E[f(\sum_{t=1}^T{Y_t})] > E[f(1+\sum_{t=1}^T{X_t})] $$
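As a quick sanity check of the claim, here is a Monte Carlo sketch for one concrete (and entirely hypothetical) choice: $Y_t \sim \text{Bernoulli}(0.6)$, $X_t \sim \text{Bernoulli}(0.5)$, $f(x)=2^x$, $T=20$. The specific distributions, function, and horizon are my own example values, not part of the question.

```python
import random

# Hypothetical example: Y_t ~ Bernoulli(0.6), X_t ~ Bernoulli(0.5),
# f(x) = 2**x, T = 20. All parameter values are assumptions for illustration.
random.seed(0)

def f(x):
    return 2.0 ** x

def mc_estimate(p, T, shift, n=50_000):
    # Monte Carlo average of f(shift + sum of T i.i.d. Bernoulli(p) draws)
    total = 0.0
    for _ in range(n):
        s = sum(1 for _ in range(T) if random.random() < p)
        total += f(shift + s)
    return total / n

lhs = mc_estimate(0.6, 20, shift=0)   # estimates E[f(sum Y_t)]
rhs = mc_estimate(0.5, 20, shift=1)   # estimates E[f(1 + sum X_t)]
print(lhs > rhs)                      # the claimed inequality at T = 20
```

For this example the exact values are $1.6^{20}$ versus $2 \cdot 1.5^{20}$, so the gap is large enough that the Monte Carlo estimate is reliable.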

A sufficient condition is that $f$ is a linear function, $f(x)=ax+b$.

PROOF: Let $\Delta = E[f(Y_t)] - E[f(X_t)] = a(E[Y_t]-E[X_t])$.

By our assumption, $\Delta>0$. Now:

$$ E\Big[f\Big(\sum_{t=1}^T Y_t\Big)\Big] - E\Big[f\Big(1+\sum_{t=1}^T X_t\Big)\Big] = a \sum_{t=1}^T (E[Y_t] - E[X_t]) - a = T\Delta - a $$

(the constant $b$ appears on both sides and cancels), and the claim is satisfied for $T > a/\Delta$.
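The linear case can be checked directly by linearity of expectation, with no simulation needed. The parameter values below ($a=2$, $b=3$, means $0.6$ and $0.5$) are an assumed example:

```python
# Hypothetical check of the linear case f(x) = a*x + b.
# Assumed example values: a = 2, b = 3, E[Y_t] = 0.6, E[X_t] = 0.5.
a, b = 2.0, 3.0
mu_Y, mu_X = 0.6, 0.5

delta = a * (mu_Y - mu_X)      # Delta = E[f(Y_t)] - E[f(X_t)] = 0.2
T_threshold = a / delta        # claim should hold for T > a/Delta = 10

def gap(T):
    # E[f(sum Y_t)] - E[f(1 + sum X_t)], computed exactly by linearity
    return (a * T * mu_Y + b) - (a * (1 + T * mu_X) + b)

print(gap(10))   # exactly at the threshold: gap is 0
print(gap(11))   # past the threshold: gap is positive
```

At $T = 10 = a/\Delta$ the gap vanishes, and for any larger $T$ it is positive, matching the bound.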


It is also true when $f$ is an exponential function, e.g. $f(x)=2^x$.

PROOF: Let $\Delta = E[f(Y_t)] / E[f(X_t)] = E[2^{Y_t}]/E[2^{X_t}]$.

By our assumption, $\Delta>1$. Now:

$$ \frac{E[f(\sum_{t=1}^T Y_t)]}{E[f(1+\sum_{t=1}^T X_t)]} = \frac{\prod_{t=1}^T E[2^{Y_t}]}{2 \prod_{t=1}^T E[2^{X_t}]} = \frac{\Delta^T}{2} $$

where the expectations factorize because the summands within each sequence are i.i.d.,

and the claim is satisfied for $T > \ln{2}/\ln{\Delta}$.
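The exponential case is also exactly computable for, say, Bernoulli summands, since $E[2^{Y_t}] = 1 + p$ when $Y_t \sim \text{Bernoulli}(p)$. The success probabilities below are assumed example values:

```python
import math

# Hypothetical check of the exponential case f(x) = 2**x.
# Assumed example: P(Y_t = 1) = 0.6, P(X_t = 1) = 0.5.
p_Y, p_X = 0.6, 0.5
EfY = 2 * p_Y + (1 - p_Y)   # E[2^{Y_t}] = 1 + p_Y = 1.6
EfX = 2 * p_X + (1 - p_X)   # E[2^{X_t}] = 1 + p_X = 1.5
delta = EfY / EfX           # Delta > 1 by assumption

T_threshold = math.log(2) / math.log(delta)   # about 10.74 here

def ratio(T):
    # E[2^{sum Y_t}] / E[2^{1 + sum X_t}] = Delta**T / 2, by the
    # factorization over the i.i.d. summands
    return delta**T / 2

print(ratio(10))   # still below 1: T too small
print(ratio(11))   # above 1: the claim holds
```

For these values the threshold $\ln 2 / \ln \Delta \approx 10.74$ falls between $T=10$ and $T=11$, which the two printed ratios confirm.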

Probably there are many more such functions. For what other functions $f$ is the theorem correct?