I am reading a book on nonparametric statistics (Tsybakov's *Introduction to Nonparametric Estimation*), in which various classes of sufficiently smooth probability densities are introduced in order to prove some important inequalities on mean-squared error. Not much is proven about these classes in general, and I would like to know what is known about them when they are regarded as metric spaces or topological spaces.
For example, given two positive real numbers $\beta, L > 0$, Tsybakov defines the **Nikol'ski class** ${\cal H}(\beta, L)$ to be the set of functions $f:\mathbb{R} \rightarrow \mathbb{R}$ such that the derivative $f^{(l)}$ of order $l = \lfloor \beta \rfloor$ exists and satisfies
$\left[ \int \left( f^{(l)}(x+t) - f^{(l)}(x) \right)^2 dx \right]^{\frac{1}{2}} \leq L|t|^{\beta - l}$ for all $t \in \mathbb{R}$.
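For intuition, here is a quick numerical sketch (my own, not from the book) suggesting that the standard Gaussian density satisfies this condition with $\beta = 1/2$, so that $l = 0$ and the requirement is just that $\|f(\cdot + t) - f(\cdot)\|_2 / |t|^{1/2}$ stays bounded over all shifts $t$; the grid and range of shifts below are illustrative choices.

```python
import numpy as np

# Check the Nikol'ski condition numerically for the standard Gaussian
# density with beta = 1/2 (so l = floor(beta) = 0 and the condition reads
# ||f(.+t) - f(.)||_2 <= L * |t|^{1/2}).  The grid and the range of
# shifts are illustrative assumptions, not from Tsybakov.

def gaussian(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

x = np.linspace(-20, 20, 400001)
dx = x[1] - x[0]

def l2_shift_diff(t):
    # L2 norm of f(x + t) - f(x), approximated by a Riemann sum
    diff = gaussian(x + t) - gaussian(x)
    return np.sqrt(np.sum(diff**2) * dx)

ts = np.logspace(-3, 1, 50)   # shifts from 0.001 to 10
ratios = [l2_shift_diff(t) / np.sqrt(t) for t in ts]
L = max(ratios)
print(f"sup over grid of ||f(.+t) - f(.)||_2 / |t|^(1/2) = {L:.4f}")
```

The ratio tends to $0$ both as $t \to 0$ (since $\|f(\cdot+t)-f(\cdot)\|_2 \leq |t| \|f'\|_2$) and as $t \to \infty$ (since it is at most $2\|f\|_2 / |t|^{1/2}$), so the supremum is finite and the Gaussian density lies in ${\cal H}(1/2, L)$ for a suitable $L$.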
Let ${\cal P}(\beta, L) = \{p \in {\cal H}(\beta, L): p \geq 0, \int p(x)dx = 1 \}$ be the set of probability densities in the Nikol'ski class.
Suppose ${\cal P}(\beta, L)$ is considered as a metric space under the $L_2$ distance.
- Is it the case that ${\cal P}(\beta, L)$ has no isolated points?
- Are open balls in ${\cal P}(\beta, L)$ connected?
- Is ${\cal P}(\beta, L)$ closed under convex combinations of measures?
- Is ${\cal P}(\beta, L)$ complete?
I suspect the answers to the first two questions are "yes", but I don't know how to prove it. For the first, it strikes me that one could simply "shift" a density to find another one nearby: given $p$, define $q(x) = p(x+\epsilon)$ for some small enough $\epsilon$. The third also initially struck me as true, but when I tried to bound a convex combination of two densities in the obvious way, I got stuck.
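To record the shift idea a bit more precisely (only a sketch, relying on the translation invariance of the Nikol'ski condition and of the $L_2$ norm):

```latex
% If q(x) = p(x + \epsilon), then q^{(l)}(x) = p^{(l)}(x + \epsilon),
% and the substitution y = x + \epsilon gives
\left[ \int \left( q^{(l)}(x+t) - q^{(l)}(x) \right)^2 dx \right]^{\frac{1}{2}}
  = \left[ \int \left( p^{(l)}(y+t) - p^{(l)}(y) \right)^2 dy \right]^{\frac{1}{2}}
  \leq L|t|^{\beta - l},
% so q \in {\cal P}(\beta, L) as well.  By continuity of translation in L_2,
\|q - p\|_2 = \|p(\cdot + \epsilon) - p(\cdot)\|_2 \to 0
  \quad \text{as } \epsilon \to 0.
```

If this is right, the remaining point is to rule out $q = p$: if $p(\cdot + \epsilon) = p$ then $p$ is periodic, and a nonzero periodic function cannot have $\int p = 1$ over $\mathbb{R}$, so $q \neq p$ for every $\epsilon \neq 0$ and $p$ would not be isolated.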