Given a probability distribution $\mathcal{P}$ over points in $\mathbb{R}^n$ that is centrally symmetric about the origin, i.e. $\mathcal{P}(x) = \mathcal{P}(-x)$, does the expected distance from a random point $x \sim \mathcal{P}$ to a fixed point $Q$ on an arbitrarily chosen ray from the origin increase monotonically as the distance of $Q$ from the origin (along the ray) increases?
I want to use the above statement as part of a larger proof if it is true. The guarantee of monotonicity is important, and I'm wondering whether this is already a known result.
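For what it's worth, here is a quick Monte Carlo sanity check (a numerical sketch, not a proof). The two-component Gaussian mixture, the sample size, and the ray direction `u` are arbitrary choices for illustration; any distribution satisfying $\mathcal{P}(x) = \mathcal{P}(-x)$ could be substituted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example of a centrally symmetric distribution in R^2:
# a symmetric two-component Gaussian mixture (P(x) = P(-x)).
n = 200_000
centers = np.array([[2.0, 1.0], [-2.0, -1.0]])
idx = rng.integers(0, 2, size=n)
x = centers[idx] + rng.normal(scale=0.5, size=(n, 2))

u = np.array([1.0, 0.0])        # arbitrary unit ray direction from the origin
ts = np.linspace(0.0, 5.0, 21)  # distances t of Q = t*u along the ray

# f(t) = E ||x - t*u||, estimated by Monte Carlo
f = [np.mean(np.linalg.norm(x - t * u, axis=1)) for t in ts]

# Check empirical monotonicity (up to Monte Carlo noise).
print(np.all(np.diff(f) >= -1e-3))
```

In every symmetric example I tried, the estimated $f(t) = \mathbb{E}_{x \sim \mathcal{P}}\,\|x - tu\|$ is nondecreasing in $t \ge 0$, which is what I'm hoping to confirm in general.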
Thanks!
Edit: I corrected my earlier post, where I mistakenly said that the distance fell off instead of increasing monotonically.