Note first that this is an open-ended question, so any answer or comment that helps me formulate it better is welcome. The question has two parts.
(1) Say we have a pivot distribution $Q \in S_{K-1}$, where $S_{K-1}$ denotes the simplex of $K$-dimensional probability vectors $(x_1, \dots, x_K)$ with $0 \leq x_k \leq 1$ and $\sum_{k=1}^K x_k = 1$.
I would like to construct a generic probability density function $f$ on $S_{K-1}$ that
- assigns smaller mass to a distribution $P \in S_{K-1}$ the larger $dist(P, Q)$ is;
- assigns larger mass to $P$ the smaller $dist(P, Q)$ is;
- ideally, is a legal density (rather than an unnormalized positive potential).
In other words, I am asking whether there is a natural way to construct $f$ that is (in some sense) inversely proportional to the distance to the pivot distribution $Q$, satisfying
$ \int_{P \in S_{K-1}} f(dist(P, Q))dP = 1. $
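As a concrete illustration of what such an $f$ could look like (my own sketch, not part of the question): take an exponential kernel $f(P) \propto \exp(-\lambda \cdot dist(P, Q))$ and normalize it by Monte Carlo, using the fact that Dirichlet$(1, \dots, 1)$ is the uniform distribution on the simplex. The metric, the kernel, and the value $\lambda = 5$ are all arbitrary choices here.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3
Q = np.array([0.5, 0.3, 0.2])  # pivot distribution, a point in S_{K-1}

def dist(P, Q):
    # total variation distance; any metric on the simplex could be used
    return 0.5 * np.abs(P - Q).sum(axis=-1)

lam = 5.0  # concentration: larger -> mass more tightly around Q

def potential(P):
    # unnormalized positive potential, decreasing in dist(P, Q)
    return np.exp(-lam * dist(P, Q))

# Dirichlet(1,...,1) samples are uniform on the simplex, so the mean of the
# potential estimates the normalizing constant w.r.t. that uniform base measure.
samples = rng.dirichlet(np.ones(K), size=100_000)
Z = potential(samples).mean()

def density(P):
    # a "legal" density w.r.t. the uniform distribution on S_{K-1}
    return potential(P) / Z

# mass is largest at the pivot itself and decays with distance from it
assert density(Q) > density(np.array([0.1, 0.1, 0.8]))
```

This only makes the normalization question explicit; whether there is a *canonical* (rather than merely convenient) choice of kernel and metric is exactly what I am asking.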
(2) Assuming there is a natural density $f$ satisfying (1), is it possible to compare the probabilities/likelihoods computed on two different supports? (Say $P_1, Q_1 \in S_{k-1}$ and $P_2, Q_2 \in S_{l-1}$ with $k \neq l$. If $f(dist(P_1, Q_1)) > f(dist(P_2, Q_2))$, can we say that $P_1$ is more probable w.r.t. $Q_1$ than $P_2$ is w.r.t. $Q_2$, even though they live on different supports?)
Put differently: when we compute a distributional distance (e.g., the KL divergence), is its value scale-free or scale-dependent? Can we compare distances obtained on different supports? Is there a scale-invariant quantity that measures distributional distance in a directly comparable way?
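To make the comparability issue concrete, here is a small numerical check (my own illustration): total variation distance always lies in $[0, 1]$ no matter the size of the support, whereas KL divergence is unbounded, which is part of why comparing KL values across supports feels suspect. The specific vectors below are arbitrary.

```python
import numpy as np

def kl(p, q):
    # KL divergence D(p || q) in nats; assumes strictly positive p and q
    return float(np.sum(p * np.log(p / q)))

def tv(p, q):
    # total variation distance, always in [0, 1] regardless of support size
    return float(0.5 * np.abs(p - q).sum())

# Pair 1: support of size 3
p1 = np.array([0.7, 0.2, 0.1]); q1 = np.array([1/3, 1/3, 1/3])
# Pair 2: support of size 10
p2 = np.full(10, 0.1); q2 = np.array([0.91] + [0.01] * 9)

# TV puts both pairs on a common [0, 1] scale...
tv1, tv2 = tv(p1, q1), tv(p2, q2)
assert 0.0 <= tv1 <= 1.0 and 0.0 <= tv2 <= 1.0

# ...whereas KL has no common scale: it can grow without bound as the
# distributions concentrate on disjoint regions, support size aside
kl1, kl2 = kl(p1, q1), kl(p2, q2)
assert kl1 > 0 and kl2 > 0
```

Bounded metrics like total variation (or Hellinger) at least share a common range across supports; whether that makes their values *meaningfully* comparable across different $K$ is the substance of my question.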