Upper bound on the expectation $E[(X-t)^+]$


Let $X$ be a random variable with $E[X]=0$ and $E[X^2]=1$ that satisfies \begin{equation} |F_X(x) - \Phi(x)| \leq \alpha, \qquad \forall x\in\mathbb{R}, \end{equation} where $F_X(\cdot)$ is the cdf and $\Phi(x) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^x e^{-y^2/2} \mathrm{d}y$.

I am looking for an upper bound on $E[ (X-t)^+]$ in terms of $\alpha$ and $t$.

  • Neglecting the constraint $|F_X(x) - \Phi(x)| \leq \alpha$, an upper bound follows from Jensen's inequality. Since $|X-t| = (t-X) + 2(X-t)^+$ and $E[X]=0$, \begin{align} \left(t + 2 E[(X-t)^+]\right)^2 = E[|X-t|]^2 \leq E[(X-t)^2] = 1 + t^2, \end{align} from which I obtain \begin{align} E[(X-t)^+] \leq \frac{1}{2}\left( \sqrt{1 + t^2} -t\right). \end{align}

  • However, when $\alpha$ is small, $X$ is approximately Gaussian distributed, and hence I expect $E[(X-t)^+]$ to be bounded as \begin{align} E[(X-t)^+] \leq \int_{t}^{\infty} \phi(x) (x-t) \,\mathrm{d}x + \epsilon(\alpha), \end{align} where $\phi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. Numerical simulations suggest that $\epsilon(\cdot)$ can be chosen as $\epsilon(x)=c \cdot x$ for some positive constant $c$.

  • I have attempted to upper bound $E[(X-t)^+]$ in terms of the cdf as follows: \begin{align} E[(X-t)^+] = \int_{t}^{\infty} (1-F_X(x))\,\mathrm{d}x \leq \int_{t}^{\infty} (1-\Phi(x)+\alpha )\,\mathrm{d}x = \int_{t}^{\infty} \phi(x) (x-t)\, \mathrm{d}x + \int_t^\infty \alpha\, \mathrm{d}x, \end{align} where the last equality follows from integration by parts. However, the final integral diverges, which suggests that the second-moment constraint must be used.
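As a numerical sanity check (not part of the question itself), the Gaussian integral in the second bullet has the closed form $\int_t^\infty \phi(x)(x-t)\,\mathrm{d}x = \phi(t) - t(1-\Phi(t))$, and one can compare it against the Jensen bound $\frac{1}{2}(\sqrt{1+t^2}-t)$ for a standard normal $X$ via Monte Carlo; the helper names below are my own.

```python
import math
import random

def phi(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def gaussian_plus_part(t):
    # Closed form of the Gaussian partial expectation:
    # integral_t^inf phi(x)(x - t) dx = phi(t) - t (1 - Phi(t))
    return phi(t) - t * (1 - Phi(t))

def jensen_bound(t):
    # Bound from Jensen's inequality: (sqrt(1 + t^2) - t) / 2
    return 0.5 * (math.sqrt(1 + t * t) - t)

random.seed(0)
n = 200_000
for t in (0.0, 0.5, 1.0, 2.0):
    # Monte Carlo estimate of E[(X - t)^+] for X ~ N(0, 1)
    mc = sum(max(random.gauss(0, 1) - t, 0) for _ in range(n)) / n
    print(t, gaussian_plus_part(t), jensen_bound(t), mc)
```

For every $t$ the Jensen bound sits strictly above the Gaussian value, which is consistent with the expectation that the distributional constraint should tighten the bound when $\alpha$ is small.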
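The integration-by-parts identity used in the last bullet, $\int_t^\infty (1-\Phi(x))\,\mathrm{d}x = \phi(t) - t(1-\Phi(t))$, can also be verified by numerical quadrature; this is a quick sketch (the truncation point $T$ and function names are my own choices, and the tail beyond $T$ is negligible).

```python
import math

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def phi(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def tail_integral(t, T=12.0, n=200_000):
    # Midpoint rule for integral_t^T (1 - Phi(x)) dx;
    # the remaining tail beyond T = 12 is far below the quadrature error.
    h = (T - t) / n
    return sum(1 - Phi(t + (k + 0.5) * h) for k in range(n)) * h

for t in (0.0, 1.0, 2.0):
    closed_form = phi(t) - t * (1 - Phi(t))
    print(t, tail_integral(t), closed_form)
```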