Suppose $X_1$ and $X_2$ are iid from an arbitrary distribution with variance $\sigma^2$. How can we derive an upper bound for: $$P(|X_1-X_2|\ge\delta)$$
One simple idea is Chebyshev's inequality: $$P(|X_1-X_2|\ge\delta)\le\frac{\operatorname{Var}(X_1-X_2)}{\delta^2}=\frac{2\sigma^2}{\delta^2}.$$ However, when $\delta<\sqrt{2}\,\sigma$ the right-hand side exceeds $1$, so the bound is vacuous (every probability is at most $1$). Clearly, the inequality can be improved in this case.
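For concreteness, here is a quick Monte Carlo sketch of the vacuous regime (the normal distribution is just an illustrative choice, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, delta = 1.0, 1.0  # delta < sqrt(2)*sigma, so Chebyshev is vacuous
n = 1_000_000

# X1, X2 iid with variance sigma^2 (normal chosen only as an example)
x1 = rng.normal(0.0, sigma, n)
x2 = rng.normal(0.0, sigma, n)

empirical = np.mean(np.abs(x1 - x2) >= delta)  # true probability, approx.
chebyshev = 2 * sigma**2 / delta**2            # the Chebyshev bound

print(empirical, chebyshev)  # empirical ≈ 0.48, bound = 2.0 > 1
```

Here the actual probability is about $0.48$, while the Chebyshev bound is $2$, which conveys no information.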
How can we get a better bound? Thank you.
If $X_1 \sim \text{Bernoulli}(p)$, then $X_1 - X_2$ takes the values $-1, 0, 1$ with probabilities $p(1-p)$, $p^2 + (1-p)^2$, $p(1-p)$, respectively. Since $\operatorname{Var}(X_1 - X_2) = 2p(1-p)$, we have $P(|X_1 - X_2| \ge 1) = 2p(1-p)$, which is precisely the Chebyshev bound with $\delta = 1$. Scaling this example gives a similar example for any other choice of $\delta$.
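A short numerical check of this example (a sketch; the value $p = 0.3$ is arbitrary):

```python
# X1, X2 iid Bernoulli(p): X1 - X2 takes values -1, 0, 1
p = 0.3

prob_ge_1 = 2 * p * (1 - p)  # P(|X1 - X2| >= 1) = P(1,0) + P(0,1)
var = 2 * p * (1 - p)        # Var(X1 - X2) = 2 Var(X1) = 2p(1-p)
chebyshev = var / 1**2       # Chebyshev bound with delta = 1

print(prob_ge_1, chebyshev)  # both 0.42: the bound is attained exactly
```

Since the bound is attained with equality, no distribution-free improvement of Chebyshev is possible at $\delta$ equal to the support gap.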