Conditions on distributions to obey a certain inequality


Given a nonnegative random variable $X$ with continuous, differentiable CDF $F$ and density $f = F'$, I want to find conditions under which the following holds for all $a, x \geq 0$: $$\frac{F(a+x)-F(a)}{x\,f(a+x)}\leq\frac{1-F(a)}{1-F(a+x)}.$$ The inequality holds for $X\sim \mathrm{Uniform}[c,d]$ and for $X\sim \mathrm{Exponential}(\lambda)$ (easy to check by substitution). In simulations it also seems to hold for every $X\sim \mathrm{Gamma}(\alpha,\beta)$, and for the Cauchy distribution.
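For what it's worth, here is a minimal numerical sweep of the kind I ran, using closed-form CDFs so it needs nothing beyond the standard library; the Gamma case is restricted to an integer shape (Erlang, shape $3$, rate $1$), where the CDF has a simple closed form. The grid values are arbitrary choices, not anything canonical:

```python
import math

def lhs_rhs(F, f, a, x):
    """Return both sides of the inequality for a CDF F with density f."""
    lhs = (F(a + x) - F(a)) / (x * f(a + x))
    rhs = (1 - F(a)) / (1 - F(a + x))
    return lhs, rhs

# Exponential(lam): F(t) = 1 - exp(-lam*t), f(t) = lam*exp(-lam*t)
lam = 1.3
F_exp = lambda t: 1 - math.exp(-lam * t)
f_exp = lambda t: lam * math.exp(-lam * t)

# Gamma with integer shape k = 3, rate 1 (Erlang), closed-form CDF:
# F(t) = 1 - exp(-t)*(1 + t + t^2/2),  f(t) = t^2 * exp(-t) / 2
F_gam = lambda t: 1 - math.exp(-t) * (1 + t + t * t / 2)
f_gam = lambda t: t * t * math.exp(-t) / 2

for F, f in [(F_exp, f_exp), (F_gam, f_gam)]:
    for a in [0.1, 0.5, 1.0, 3.0, 5.0]:
        for x in [0.05, 0.5, 1.0, 2.0]:
            l, r = lhs_rhs(F, f, a, x)
            assert l <= r + 1e-12, (a, x, l, r)
print("inequality held on the whole grid")
```

For the Exponential case the check is really a closed-form identity: the LHS reduces to $(e^{\lambda x}-1)/(\lambda x)$ and the RHS to $e^{\lambda x}$.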

Some geometric intuition: consider two points $a$ and $a+x$ on the CDF. The numerator of the LHS is how far $F$ drops going backward from $a+x$ to $a$, while the denominator is the corresponding drop along the tangent to $F$ at $a+x$. The RHS is the ratio of how far $F$ is from $1$ at $a$ versus at $a+x$; note that it's always $\geq 1$ since $F$ is non-decreasing. Now if $F$ is convex on $[a,a+x]$, the inequality clearly holds: the tangent drops by at least as much as $F$, so the LHS is $\leq 1$. This also shows why it works for any uniform distribution.
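Spelling out the tangent argument (assuming convexity on the whole interval $[a,a+x]$, so that the tangent at $a+x$ lies below $F$ there): $$F(a) \;\geq\; F(a+x) + f(a+x)\bigl(a-(a+x)\bigr) \;=\; F(a+x) - x\,f(a+x) \quad\Longrightarrow\quad \frac{F(a+x)-F(a)}{x\,f(a+x)} \;\leq\; 1 \;\leq\; \frac{1-F(a)}{1-F(a+x)}.$$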

Essentially, I want to know whether there is some better-known condition on distributions that implies this (or whether it is always true...).