An integral inequality related to the normal distribution


Through numerical experiments, I conjecture that $$I=\int_0^\infty \big(\big|1-e^{\sigma(-x_1+x)}\big|-(1-e^{-\sigma x_1})\big)e^{-\frac{(x_1+x)^2}2}\,dx>0$$ for all $x_1>0$ and $\sigma>0$. Is this true?


This inequality is false. Indeed, as Greg Martin and the OP noted in the comments, $I$ can be expressed in terms of the error function: $$I=\sqrt{\frac{\pi }{2}} e^{-\sigma x_1} \left(2 e^{\frac{1}{2} \sigma \left(\sigma -2 x_1\right)} \text{erf}\left(\frac{\sigma -2 x_1}{\sqrt{2}}\right)+e^{\frac{1}{2} \sigma \left(\sigma -2 x_1\right)} \text{erf}\left(\frac{x_1-\sigma }{\sqrt{2}}\right)-\text{erf}\left(\frac{x_1}{\sqrt{2}}\right)+\text{erf}\left(\sqrt{2} x_1\right)-2 \text{erfc}\left(\sqrt{2} x_1\right) e^{\sigma x_1}+\text{erfc}\left(\sqrt{2} x_1\right)+e^{\frac{1}{2} \sigma \left(\sigma -2 x_1\right)}\right). $$ Evaluating this at $x_1=\sigma=9/10$ gives $I=-0.07296807214284069917\ldots<0$, a counterexample.

Here is the corresponding Mathematica notebook:

[Mathematica notebook screenshot]
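For readers without Mathematica, here is a short Python sketch (not part of the original answer; function names are my own) that checks the counterexample two ways: by direct numerical quadrature of the integral, and by evaluating the erf expression above with `math.erf`/`math.erfc`.

```python
import math

def integrand(x, x1, sigma):
    # (|1 - e^{sigma(x - x1)}| - (1 - e^{-sigma x1})) * e^{-(x1 + x)^2 / 2}
    return (abs(1.0 - math.exp(sigma * (x - x1)))
            - (1.0 - math.exp(-sigma * x1))) * math.exp(-0.5 * (x1 + x) ** 2)

def I_numeric(x1, sigma, upper=12.0, n=200_000):
    # Composite trapezoidal rule on [0, upper]; the Gaussian factor makes
    # the tail beyond `upper` negligible for moderate x1 and sigma.
    h = upper / n
    total = 0.5 * (integrand(0.0, x1, sigma) + integrand(upper, x1, sigma))
    for k in range(1, n):
        total += integrand(k * h, x1, sigma)
    return total * h

def I_closed_form(x1, s):
    # The erf expression from the answer, transcribed term by term.
    e = math.exp(0.5 * s * (s - 2.0 * x1))
    r2 = math.sqrt(2.0)
    return math.sqrt(math.pi / 2.0) * math.exp(-s * x1) * (
        2.0 * e * math.erf((s - 2.0 * x1) / r2)
        + e * math.erf((x1 - s) / r2)
        - math.erf(x1 / r2)
        + math.erf(r2 * x1)
        - 2.0 * math.erfc(r2 * x1) * math.exp(s * x1)
        + math.erfc(r2 * x1)
        + e)

x1 = sigma = 0.9
print(I_numeric(x1, sigma))      # ≈ -0.0729680721, so I < 0
print(I_closed_form(x1, sigma))  # agrees with the quadrature value
```

Both evaluations agree and are negative at $x_1=\sigma=9/10$, confirming the counterexample.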