Define the probability density and cumulative probability of the standard Gaussian: $$ f(t) = \frac{1}{\sqrt{2\pi}} e^{-t^2/2}, \qquad \text{erf}(x) = \int_{-\infty}^x f(t)\, dt. $$
How can I prove that the following ratio satisfies the bound below? $$ \frac{1-\text{erf}(x)}{f(x)} = \sqrt{2\pi}\, e^{x^2/2} \left(1-\text{erf}(x)\right) \geq x^{-1}-x^{-3}. $$
A lower bound is not easy to find. It is always easy to bound a fast-decaying function like $f$ from above, for example by something like $e^{-x(t-x)}$, to deduce $\frac{1-\text{erf}(x)}{f(x)} < x^{-1}$, but functions decaying even faster than $f$ are usually hard to work with. It looks as if these are the first two terms of a series expansion, and after some searching I did find a Laurent series of this kind here, but the coefficients do not match. So this bound is probably motivated by something else.
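To spell out the easy direction mentioned above (a sketch of the standard argument): for $t \ge x > 0$ we have $\frac{t^2}{2} - \frac{x^2}{2} = \frac{(t-x)(t+x)}{2} \ge x(t-x)$, so $f(t) \le f(x)\, e^{-x(t-x)}$ and hence $$ 1 - \text{erf}(x) = \int_x^\infty f(t)\, dt \le f(x) \int_x^\infty e^{-x(t-x)}\, dt = \frac{f(x)}{x}, $$ which gives the upper bound $\frac{1-\text{erf}(x)}{f(x)} \le x^{-1}$.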
In a slightly different form, this is Theorem 1.2.6 of Durrett, Probability: Theory and Examples, 5th edition. (It probably appears elsewhere as well, but that's where I know it from.) Durrett observes that
$$ \int_x^\infty (1 - 3y^{-4}) \exp(-y^2/2) \: dy = (x^{-1} - x^{-3}) \exp(-x^2/2)$$
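As a quick sanity check (not part of the proof), the differentiation claim can be verified numerically, e.g. in Python with a central finite difference; the function names here are just illustrative:

```python
import math

def antiderivative(y):
    # Candidate antiderivative (up to sign): (y^-1 - y^-3) * exp(-y^2/2)
    return (1 / y - 1 / y**3) * math.exp(-y * y / 2)

def integrand(y):
    # Integrand from Durrett's identity: (1 - 3 y^-4) * exp(-y^2/2)
    return (1 - 3 / y**4) * math.exp(-y * y / 2)

# Check that -d/dy antiderivative(y) == integrand(y) by finite differences,
# which is exactly the "checked by differentiation" step.
h = 1e-6
for y in [1.5, 2.0, 3.0, 5.0]:
    deriv = (antiderivative(y + h) - antiderivative(y - h)) / (2 * h)
    assert abs(-deriv - integrand(y)) < 1e-8
```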
(which can be checked by differentiation). The left-hand side is obviously less than $\int_x^\infty e^{-y^2/2} \: dy$ and so you have
$$ \int_x^\infty e^{-y^2/2} \: dy \ge (x^{-1} - x^{-3}) \exp(-x^2/2) $$
Since the LHS equals $\sqrt{2\pi}\,(1 - \text{erf}(x))$ and $e^{-x^2/2} = \sqrt{2\pi}\, f(x)$, dividing both sides by $\sqrt{2\pi}\, f(x)$ gives the bound you want: $\frac{1-\text{erf}(x)}{f(x)} \ge x^{-1} - x^{-3}$.
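For a numerical sanity check of the two-sided bound, note that the question's $\text{erf}$ is the standard normal CDF $\Phi$, so $1 - \text{erf}(x) = \tfrac12\,\mathrm{erfc}(x/\sqrt{2})$ in terms of the standard-library complementary error function; the helper name `mills_ratio` below is just illustrative:

```python
import math

def mills_ratio(x):
    # (1 - erf(x)) / f(x), where erf here is the standard normal CDF
    # and f its density; 1 - Phi(x) = erfc(x / sqrt(2)) / 2.
    tail = 0.5 * math.erfc(x / math.sqrt(2))
    density = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return tail / density

# x^-1 - x^-3 <= Mills ratio <= x^-1 for x > 0
for x in [0.5, 1.0, 2.0, 4.0, 8.0]:
    assert x**-1 - x**-3 <= mills_ratio(x) <= x**-1
```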