Let $X$ be a random variable taking values in $\mathbb{R}^N$, with probability density function $f:\mathbb{R}^N\rightarrow \mathbb{R}_{\ge 0}$.
Consider a new random variable $Y$ taking values in the countable set $(y_n)_{n\in\mathbb{N}}\subseteq \mathbb{R}^N$, with respective probabilities $(p_n)_{n\in\mathbb{N}}$, so $P(Y=y_n)=p_n$ and $\sum_{n=0}^\infty{p_n}=1$. Without loss of generality, we may assume that $p_n$ is weakly decreasing in $n$.
By choosing $y_n$ and $p_n$ appropriately, can we ensure that (non-trivial) error bounds of something vaguely like one of the following forms are satisfied?
$$\forall x\in\mathbb{R}^N\setminus\{y_n|n\in\mathbb{N}\},\,\forall \epsilon>0,\,|P(\|X-x\|_2<\epsilon)-P(\|Y-x\|_2<\epsilon)|\le C \epsilon^{\kappa},$$
for some $C>0$ and $\kappa>0$, possibly depending on $N$.
AND/OR
$$\forall x\in\mathbb{R}^N\setminus\{y_n|n\in\mathbb{N}\},\,\forall \epsilon>0,\,|P(\|X-x\|_2<\epsilon)-P(\|Y-x\|_2<\epsilon)|\le C P(\|X-x\|_2<\epsilon),$$
for some $C\in [0,1)$,
AND/OR
$$\forall x\in\mathbb{R}^N\setminus\{y_n|n\in\mathbb{N}\},\,\lim_{\epsilon\rightarrow 0}{\frac{P(\|X-x\|_2<\epsilon)}{P(\|Y-x\|_2<\epsilon)}}=1.$$
I am not particularly attached to the precise form of these results. I am just curious whether having countably many points means you can get, in some sense, "close" to the true distribution, in a way that you never can with finitely many points (since with finitely many, sufficiently small balls around points off the support will contain none of the $y_n$).
I.e. is there any sense in which you can do better with countably many support points than you can with finitely many?
For example, suppose we take $p_n={(n+1)}^{-1-\theta}{[\sum_{k=0}^\infty{{(k+1)}^{-1-\theta}}]}^{-1}$ for some small $\theta>0$. Then it seems plausible that there could be an algorithm for constructing $y_n$ that proceeds by drawing points from $X$ and discarding them if they are too close to previously drawn points.
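A minimal sketch of such a construction (the shrinking exclusion radius $0.5/(k+1)$ and $\theta=0.1$ are arbitrary illustrative choices, and $X$ is taken to be standard normal):

```python
import numpy as np

rng = np.random.default_rng(0)

def construct_support(n_points, radius_fn, dim=1):
    """Draw candidates from X and keep a draw only if it is at least
    radius_fn(k) away from every previously accepted point, where k
    is the number of points accepted so far."""
    points = []
    while len(points) < n_points:
        x = rng.standard_normal(dim)
        r = radius_fn(len(points))
        if all(np.linalg.norm(x - y) >= r for y in points):
            points.append(x)
    return np.array(points)

def weights(n_points, theta=0.1):
    """Truncated zeta-style weights p_n proportional to (n+1)^(-1-theta)."""
    p = np.arange(1, n_points + 1) ** (-1.0 - theta)
    return p / p.sum()

ys = construct_support(200, radius_fn=lambda k: 0.5 / (k + 1))
ps = weights(200)
```

Because the exclusion radius shrinks, the total excluded measure at step $k$ stays bounded, so the loop terminates; later points are allowed to fill in ever finer gaps.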
My worry, though, is that the early points will end up with a kind of "rain shadow" around them: sets close to $y_1$, but not containing $y_1$, will end up with lower mass than they should have.
Can this error be bounded?
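One way to probe this worry empirically (again a sketch, with arbitrary choices: 500 support points, exclusion radii $0.2/(k+1)$, $\theta=0.1$, and probe balls placed just beyond $y_1$ so that they exclude it):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Build a discrete approximation of a 1-D standard normal by rejection:
# keep a draw only if it is far enough from all previously kept points.
points = []
while len(points) < 500:
    x = rng.standard_normal()
    if all(abs(x - y) >= 0.2 / (len(points) + 1) for y in points):
        points.append(x)
points = np.array(points)
p = np.arange(1, 501) ** -1.1  # p_n proportional to (n+1)^(-1-theta)
p /= p.sum()

def true_ball_mass(x, eps):
    """P(|X - x| < eps) for X ~ N(0, 1)."""
    Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))
    return Phi(x + eps) - Phi(x - eps)

def discrete_ball_mass(x, eps):
    """P(|Y - x| < eps) for the discrete approximation."""
    return p[np.abs(points - x) < eps].sum()

# Probe balls that sit next to y_1 without containing it.
y1 = points[0]
for eps in (0.2, 0.05, 0.01):
    x0 = y1 + 1.5 * eps  # the ball (x0 - eps, x0 + eps) excludes y_1
    ratio = discrete_ball_mass(x0, eps) / true_ball_mass(x0, eps)
    print(f"eps={eps}: discrete/true mass ratio = {ratio:.3f}")
```

If the rain-shadow worry is right, the ratio should fall well below $1$ for small $\epsilon$, since such balls lie inside the exclusion zone of $y_1$ but receive none of its mass.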
Let $X$ be a standard normal distribution (any continuous distribution will do), and let $Y$ be a random variable taking values in a countable set $S$. Then $P(Y \in S)=1$ by definition, while $P(X \in S)=0$ since $S$ is countable and $X$ has a density. So for every approximating discrete random variable there is a Borel set, namely $S$ itself, on which the approximation fails completely.
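The observation above can be phrased as a statement about total variation distance: for any continuous $X$ and any discrete $Y$ with countable support $S$,
$$\sup_{B\in\mathcal{B}(\mathbb{R}^N)}|P(X\in B)-P(Y\in B)|\;\ge\;|P(X\in S)-P(Y\in S)|=|0-1|=1.$$
So closeness in total variation is impossible, and any meaningful approximation statement has to restrict attention to a smaller class of sets, such as the balls used in the question.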