The Chernoff, Markov, and Chebyshev inequalities all give upper bounds on tail probabilities; for example, Chebyshev gives us
$Pr[|X-E[X]| \geq t] \leq \frac{Var[X]}{t^2}$.
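(Just to fix ideas, here is a quick numerical check of Chebyshev's bound, a minimal sketch in Python with NumPy; the discrete distribution below is an arbitrary illustration, not part of the actual problem.)

```python
import numpy as np

# Numerical check of Chebyshev's inequality on an illustrative distribution.
xs = np.arange(7)                                  # support {0, ..., 6}
ps = np.array([1, 2, 4, 6, 4, 2, 1], dtype=float)
ps /= ps.sum()                                     # normalise to a probability vector

mean = ps @ xs
var = ps @ (xs - mean) ** 2
for t in (1.0, 2.0, 3.0):
    tail = ps[np.abs(xs - mean) >= t].sum()        # P(|X - E[X]| >= t)
    print(f"t={t}: tail = {tail:.3f} <= Var/t^2 = {var / t**2:.3f}")
```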
This is quite helpful, but what if I would like to find a function $f(E[X], Var[X])$ such that
$Pr[|X-E[X]| \geq t] \geq f(E[X], Var[X])$?
Some more context, if needed: $X$ is a random variable with expected value $E[X]$, variance $Var[X]$, and support $\mathcal{X}$. We define $$I:=[E[X]-t, E[X]+t], \qquad I^c:=\mathcal{X} \setminus I.$$ I want to bound the expectation $E[g(X)]=\sum_{x \in \mathcal{X}} Pr[X=x]\, g(x)$, so I'd like to group the summands, e.g.
$$\begin{aligned}
E[g(X)] &= \sum_{x \in I}Pr[X=x] \cdot g(x)+\sum_{x \in I^c}Pr[X=x] \cdot g(x) \\
&\leq Pr[|X-E[X]| \leq t] \cdot \max_{x\in I } g(x) + Pr[|X-E[X]| \geq t] \cdot \max_{x\in I^c }g(x) \\
&\leq \bigl(1-f(E[X],Var[X])\bigr)\cdot \max_{x\in I } g(x) + \frac{Var[X]}{t^2}\cdot \max_{x\in I^c }g(x).
\end{aligned}$$
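As a sanity check (not a proof), here is a small Python/NumPy sketch that evaluates both sides of the first inequality above for one arbitrary distribution and one arbitrary non-linear $g$; the unknown $f$ only enters in the last step, so that step is not checked here.

```python
import numpy as np

# Illustrative discrete distribution (not from the actual problem).
xs = np.array([-2.0, -1.0, 0.0, 1.0, 3.0])
ps = np.array([0.1, 0.2, 0.4, 0.2, 0.1])

def g(x):
    # arbitrary example of a non-linear, non-convex, non-concave function
    return np.abs(np.sin(3 * x)) + 0.1 * x ** 2

mean = ps @ xs
t = 1.5
in_I = np.abs(xs - mean) <= t                 # x in I = [E[X]-t, E[X]+t]

E_g = ps @ g(xs)                              # E[g(X)]
bound = ps[in_I].sum() * g(xs[in_I]).max() + ps[~in_I].sum() * g(xs[~in_I]).max()
print(E_g, bound, E_g <= bound + 1e-12)
```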
I'm not sure that the calculation above makes sense, so if you could tell me whether it is correct, or alternatively give me another way to find an upper bound for $E[g(X)]$ based on $E[X]$ and $Var[X]$, that would be great.
It might be important that in general $g$ is NOT a linear function and is neither convex nor concave.
Assuming without loss of generality that $E(X)=0$ and $\mathrm{var}(X)=1$, the question seems to be whether there exists some positive function $u$, independent of the distribution of $X$, such that, for every $X$ with $E(X)=0$ and $\mathrm{var}(X)=1$, and for every $t\gt0$, $$P(|X|\geqslant t)\geqslant u(t). $$ First, if $t\gt1$, the event $[|X|\geqslant t]$ may be empty (take for example $P(X=1)=P(X=-1)=\frac12$), hence no such $u(t)$ exists when $t\gt1$. But even when $t\lt1$, no such function $u$ can exist without some further restrictions on the distribution of $X$, as the following example shows.
Assume that, for some $s\lt1$ and some $x(s)\gt1$, $$P(X=s)=P(X=-s)=\frac12(1-s),$$ and $$P(X=\sqrt{x(s)})=P(X=-\sqrt{x(s)})=\frac12s.$$ Then $X$ is centered and its variance is $(1-s)s^2+sx(s)$, hence the choice $$x(s)=\frac1s-s(1-s)$$ ensures that $\mathrm{var}(X)=1$. Let $t\gt0$. For every $s$ small enough (so that $s\lt t$ and $x(s)\gt t^2$), $[|X|\geqslant t]=[|X|=\sqrt{x(s)}]$, which has probability $s$. Since $s$ can be made as small as desired, this contradicts the existence of some $u(t)\gt0$ such that the lower bound holds.
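Here is a minimal numerical sketch of this construction (Python with NumPy, both assumed available): for each small $s$ it builds the four-point distribution above and confirms that the mean is $0$, the variance is $1$, and $P(|X|\geqslant t)=s$.

```python
import numpy as np

def example(s, t):
    x_s = 1.0 / s - s * (1.0 - s)                 # x(s) chosen so that var(X) = 1
    xs = np.array([-np.sqrt(x_s), -s, s, np.sqrt(x_s)])
    ps = np.array([s / 2, (1 - s) / 2, (1 - s) / 2, s / 2])
    mean = ps @ xs
    var = ps @ xs ** 2 - mean ** 2
    tail = ps[np.abs(xs) >= t].sum()              # P(|X| >= t)
    return mean, var, tail

for s in (0.1, 0.01, 0.001):
    print(example(s, t=0.5))                      # mean ~ 0, var = 1, tail = s
```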
On the other hand, if $X$ is centered with variance $1$ and if $|X|\leqslant C$ for some $C\geqslant1$, then, for every $t\geqslant0$, $$ E(X^2)\leqslant t^2+C^2P(|X|\geqslant t), $$ since $X^2\leqslant t^2$ on $[|X|\lt t]$ and $X^2\leqslant C^2$ on $[|X|\geqslant t]$. Because $E(X^2)=1$, this yields, for every $t$ in $(0,1]$, $$ P(|X|\geqslant t)\geqslant\frac{1-t^2}{C^2}.$$
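Again a minimal sketch (Python/NumPy assumed) illustrating this bound on one bounded example; the three-point distribution below is an arbitrary choice with mean $0$, variance $1$ and $|X|\leqslant C=2$.

```python
import numpy as np

C = 2.0
xs = np.array([-2.0, 0.0, 2.0])
ps = np.array([0.125, 0.75, 0.125])     # mean 0, variance 1, |X| <= C

for t in (0.25, 0.5, 0.75, 1.0):
    tail = ps[np.abs(xs) >= t].sum()    # P(|X| >= t)
    lower = (1 - t ** 2) / C ** 2       # claimed lower bound
    print(t, tail, lower, tail >= lower - 1e-12)
```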