Given a random variable $X>0$ with Moment Generating Function $m(s)=E[e^{sX}]$, I'm interested in finding a lower bound $$\Pr[X \ge t] \ge 1-\varepsilon(t),$$ where $t>E[X]$.
A classic technique for upper-bounding $\Pr[X \ge t]$ is Markov's inequality applied to the Moment Generating Function (the Chernoff bound): $$\Pr[X \ge t] \le E[e^{sX}]e^{-st},$$ for any $s\ge 0$. Since $E[e^{s X}]\ge e^{s E[X]}$ by Jensen's inequality, this bound is only useful when $t>E[X]$. (As $X$ is positive, $E[X]>0$.)
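As a concrete sanity check (my own example, not part of the question), for $X\sim\mathrm{Exp}(1)$ we have $m(s)=1/(1-s)$ for $s<1$, and optimizing the Chernoff bound numerically recovers the closed-form optimum $s^\*=1-1/t$, giving the bound $t\,e^{1-t}\ge e^{-t}=\Pr[X\ge t]$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

t = 3.0  # threshold, t > E[X] = 1 for X ~ Exp(1)

def log_chernoff_bound(s):
    # log of the Markov/MGF bound m(s) * e^{-st}, with m(s) = 1/(1-s) for 0 <= s < 1
    return -np.log(1.0 - s) - s * t

# The exponent is convex in s, so a bounded scalar minimization finds the optimum
res = minimize_scalar(log_chernoff_bound, bounds=(0.0, 0.999), method='bounded')
bound = np.exp(res.fun)   # optimized Chernoff bound, equals t * e^{1-t} = 3e^{-2} here
exact = np.exp(-t)        # exact tail probability for Exp(1)
print(bound, exact)
```

The optimized bound ($\approx 0.406$) is loose relative to the exact tail ($\approx 0.050$), but it decays at the correct exponential rate.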
Standard techniques for lower bounds include second-moment methods such as Cantelli's inequality and the Paley–Zygmund inequality. However, these only cover the case $t\in[0,E[X]]$ and say nothing about the case $t>E[X]$.
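For concreteness, Paley–Zygmund states that for $\theta\in[0,1]$,
$$\Pr[X \ge \theta\, E[X]] \ge (1-\theta)^2\,\frac{E[X]^2}{E[X^2]},$$
so the threshold $\theta E[X]$ never exceeds $E[X]$, which is why it cannot address $t>E[X]$.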
My best bet so far has been Lévy's and Gil-Pelaez's inversion formulas for the Characteristic Function $m(i s)$:
$$\begin{align}\Pr[X \ge t] &= 1-\frac{1} {2\pi} \lim_{S \to \infty} \int_{-S}^{S} \frac{1 - e^{-ist}} {is}\, m(is)\, ds. \\\text{and}\quad \Pr[X \ge t] &= \frac{1}{2} + \frac{1}{\pi}\int_0^\infty \frac{\operatorname{Im}[e^{-ist}m(i s)]}{s}\,ds.\end{align}$$
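The Gil-Pelaez formula does at least evaluate correctly in practice. Here is a numerical sketch (my own illustration) for $X\sim\mathrm{Exp}(1)$, where $m(is)=1/(1-is)$ and the exact tail is $e^{-t}$; the integral is truncated at a large finite cutoff, which introduces only a small error since the integrand decays like $1/s^2$:

```python
import numpy as np
from scipy.integrate import quad

t = 2.0  # threshold; exact tail for Exp(1) is exp(-t)

def integrand(s):
    # Im[e^{-ist} m(is)] / s for X ~ Exp(1), where m(is) = 1/(1 - is)
    phi = 1.0 / (1.0 - 1j * s)
    return np.imag(np.exp(-1j * s * t) * phi) / s

# Truncate at s = 200; the oscillatory tail beyond contributes O(1/(t*200^2))
val, _ = quad(integrand, 1e-9, 200.0, limit=2000)
gil_pelaez = 0.5 + val / np.pi
print(gil_pelaez, np.exp(-t))  # the two values should agree closely
```

This confirms the formula numerically, but of course does not by itself yield an analytic lower bound.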
However, I don't have any good ideas for bounding these integrals in a useful way. It certainly seems much harder than the Markov argument for the upper bound.
Am I missing a useful approach here?