Upper bound on expected distance between two i.i.d. random elements of a metric space?


Suppose $M$ is a metric space with metric $d$, and $X$ and $Y$ are i.i.d. random elements of $M$. For a random element $X$, define $\operatorname{supp}(X) = \bigcap_{P(X \in A) = 1} A$. Suppose $\operatorname{supp}(X)$ is bounded. Is it always true that $E\,d(X, Y) \leq \frac{\operatorname{diam}(\operatorname{supp}(X))}{2}$?

For $\mathbb{R}$ with the metric $d(x, y) = |x-y|$ this is indeed true: for any i.i.d. random variables $X$ and $Y$ with $P(X \in [0, 1]) = 1$, the Hölder inequality gives $E|X - Y| \leq E|X - Y|^2 = 2\operatorname{Var}(X) \leq \frac{1}{2}$. However, I do not know how to prove this for an arbitrary metric space.
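A quick Monte Carlo check (a Python sketch, assuming $X$ and $Y$ uniform on $[0, 1]$) shows where this argument breaks down: $E|X - Y|$ comes out near $1/3$ while $E|X - Y|^2 = 2\operatorname{Var}(X)$ is only about $1/6$, so the step $E|X - Y| \leq E|X - Y|^2$ cannot hold as stated. Hölder (Cauchy–Schwarz) only yields $E|X - Y| \leq \sqrt{E|X - Y|^2}$.

```python
import random

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]  # X ~ Uniform[0, 1]
ys = [random.random() for _ in range(n)]  # Y ~ Uniform[0, 1], independent

# Empirical E|X - Y|; the exact value for uniforms is 1/3.
mean_abs = sum(abs(x - y) for x, y in zip(xs, ys)) / n
# Empirical E|X - Y|^2 = 2 Var(X); the exact value is 1/6.
mean_sq = sum((x - y) ** 2 for x, y in zip(xs, ys)) / n

print(mean_abs, mean_sq)  # mean_abs exceeds mean_sq
```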

EDIT: My proof for $\mathbb{R}$ is false (Hölder gives $E|X - Y| \leq \sqrt{E|X - Y|^2}$, not $E|X - Y| \leq E|X - Y|^2$). Currently we do not know even the case of $\mathbb{R}$.


BEST ANSWER

In a general metric space, this is certainly wrong. Consider the discrete metric space on $n$ points (so $d(x, y) = 1$ whenever $x \neq y$) with $X$ uniform on those points. Then $E\,d(X, Y) = P(X \neq Y) = \frac{n-1}{n}$, which tends to $1$ as $n \to \infty$, while the diameter is $1$, so the claimed bound of $\frac{1}{2}$ fails for $n \geq 3$.
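The counterexample above can be checked numerically (a Python sketch, assuming $X$ and $Y$ uniform on $n = 10$ points with the discrete metric):

```python
import random

random.seed(1)
n_points = 10
trials = 100_000

# Discrete metric: d(x, y) = 1 if x != y, else 0, so
# E d(X, Y) = P(X != Y) = (n - 1) / n = 0.9 for n = 10.
hits = sum(
    1 for _ in range(trials)
    if random.randrange(n_points) != random.randrange(n_points)
)
mean_dist = hits / trials

print(mean_dist)  # close to 0.9, well above diam/2 = 0.5
```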

For $\mathbb{R}^n$, I think it suffices to look at balls, since the left-hand side does not decrease if you replace $\operatorname{supp}(X)$ by a ball whose diameter equals $\operatorname{diam}(\operatorname{supp}(X))$. For this, maybe https://pdfs.semanticscholar.org/7680/3e21c4cc2d7bfabd6ccdf3fb30bc76fb75bd.pdf will help.
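As a sanity check for the ball case (a Python Monte Carlo sketch, assuming $X$ and $Y$ uniform on the unit disk in $\mathbb{R}^2$): the mean distance comes out near the known exact value $\frac{128}{45\pi} \approx 0.905$, which is below $\operatorname{diam}/2 = 1$, so the inequality does hold for the disk.

```python
import math
import random

random.seed(2)

def sample_unit_disk():
    """Rejection sampling: uniform point in the unit disk of R^2."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

trials = 100_000
total = 0.0
for _ in range(trials):
    x1, y1 = sample_unit_disk()
    x2, y2 = sample_unit_disk()
    total += math.hypot(x1 - x2, y1 - y2)

mean_dist = total / trials  # exact value: 128 / (45 * pi) ~= 0.9054
print(mean_dist, "vs diam/2 =", 1.0)
```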
