Consider a real-valued random variable $X$. I want a condition on the distribution of $X$ under which the following function:
$$ t\longmapsto\sup_{S}\,\max\left(\frac{P[X\in S]}{P[X+t\in S]},\frac{P[X+t\in S]}{P[X\in S]}\right) $$
is nondecreasing on $\mathbb{R}^+$, where the supremum ranges over all measurable sets $S\subseteq\mathbb{R}$.
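For what it's worth, here is the reduction I have in mind when $X$ has an everywhere-positive density $f$ (with the suprema interpreted in $[1,+\infty]$): since $P[X+t\in S]=\int_S f(x-t)\,dx$, the worst-case ratio over sets equals the essential supremum of the pointwise density ratio,
$$\sup_S \frac{P[X+t\in S]}{P[X\in S]}=\operatorname*{ess\,sup}_{x\in\mathbb{R}}\frac{f(x-t)}{f(x)},$$
and symmetrically for the other ratio, so the function above equals $\operatorname*{ess\,sup}_x \max\left(\frac{f(x)}{f(x-t)},\frac{f(x-t)}{f(x)}\right)$.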
Intuitively, I consider two distributions "similar" when the probability of any event under one is multiplicatively bounded by the probability of that event under the other, and the worst-case multiplicative factor measures how dissimilar they are. I want a condition on a probability distribution such that the more you translate it, the "more different" it becomes from its original version.
I'm looking for a characterization of the probability distributions that satisfy the above. I conjecture that if the PDF is continuous and unimodal, i.e. increasing and then decreasing (a single "bump"), then the condition above is satisfied. It holds for the Gaussian and Laplace distributions, but I can't prove it in the general case. Is this conjecture true, and if not, what is a counterexample and an alternative condition?
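Here is a minimal numerical sketch of how I check candidate densities (the helper name `worst_case_log_ratio`, the grid, and the tolerance are my own choices); it uses the density-ratio form above, and a finite grid can only suggest, not prove, monotonicity:

```python
import numpy as np

# Check whether t -> sup_x max(f(x)/f(x-t), f(x-t)/f(x)) looks nondecreasing
# on a grid. Working in logs: the log of the worst-case ratio at shift t is
# sup_x |log f(x) - log f(x-t)|, which also avoids overflow for light tails.

def worst_case_log_ratio(log_f, xs, t):
    """Approximate sup_x |log f(x) - log f(x-t)| over the finite grid xs.

    The grid truncates the real line, so an infinite supremum would only
    show up here as a very large finite number."""
    return np.max(np.abs(log_f(xs) - log_f(xs - t)))

xs = np.linspace(-30.0, 30.0, 200001)   # evaluation grid
ts = np.linspace(0.0, 3.0, 31)          # shifts to test

log_densities = {
    # log-densities up to additive constants (constants cancel in the ratio)
    "Gaussian": lambda x: -x**2 / 2.0,
    "Laplace":  lambda x: -np.abs(x),
}

for name, log_f in log_densities.items():
    vals = [worst_case_log_ratio(log_f, xs, t) for t in ts]
    ok = np.all(np.diff(vals) >= -1e-9)  # small slack for float noise
    print(f"{name}: nondecreasing on this grid: {ok}")
```

(On this grid the Laplace value is exactly $t$, matching the closed form $e^t$ for its worst-case ratio; the Gaussian value is governed by the grid edge, since $f(x-t)/f(x)=e^{tx-t^2/2}$ is unbounded in $x$, so its supremum should be read in $[1,+\infty]$.)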