Consider the probability distribution $p(x)$. I am wondering if there is a general notion of uncertainty for $p(x)$. To provide some context, I am interested in ordering distributions by uncertainty: $f(p(x)) \leq f(p(y))$ would mean the uncertainty in $p(x)$ is not greater than the uncertainty in $p(y)$, where $f$ maps a distribution to a real number quantifying its uncertainty.
This seems difficult in general, as "uncertainty" is ambiguous. Here is a simple example.
Let $p(x)$ and $p(y)$ be Gaussian, so $p(x) = \mathcal{N}(x|\mu_x,\sigma_x^2)$ and $p(y) = \mathcal{N}(y|\mu_y,\sigma_y^2)$. In this scenario, if $\sigma_x^2 \leq \sigma_y^2$, I would consider $p(x)$ to have less uncertainty than $p(y)$.
However, even when extended to a multivariate Gaussian (e.g., $p(\mathbf{x}| \mathbf{\mu}_x, \mathbf{\Sigma}_x)$ and $p(\mathbf{y}| \mathbf{\mu}_y, \mathbf{\Sigma}_y)$), the quantification is ambiguous because it requires an ordering on covariance matrices, and the natural (Loewner) order on positive semi-definite matrices is only a partial order. Scalar summaries such as the trace, determinant, or maximum eigenvalue are often used instead, and they can disagree.
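To make the ambiguity concrete, here is a small sketch (with hypothetical covariance matrices chosen for illustration) where the trace and maximum eigenvalue rank the two distributions one way, while the determinant ranks them the other way:

```python
import numpy as np

# Hypothetical covariance matrices chosen so that the usual
# scalar summaries disagree about which is "more uncertain".
Sigma_x = np.diag([4.0, 0.25])
Sigma_y = np.diag([1.5, 1.5])

# Trace and largest eigenvalue say Sigma_y is "smaller" ...
trace_x, trace_y = np.trace(Sigma_x), np.trace(Sigma_y)          # 4.25 vs 3.0
eig_x = np.linalg.eigvalsh(Sigma_x).max()                        # 4.0
eig_y = np.linalg.eigvalsh(Sigma_y).max()                        # 1.5

# ... but the determinant (the "generalized variance") says
# Sigma_x is smaller: ~1.0 vs ~2.25.
det_x, det_y = np.linalg.det(Sigma_x), np.linalg.det(Sigma_y)
```

So whether $p(\mathbf{x})$ or $p(\mathbf{y})$ is "more uncertain" depends entirely on which scalar summary you pick.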
So, is there a general notion of uncertainty for probability distributions?
For general distributions, a common measure of "uncertainty" comes from information theory: the entropy of the distribution (differential entropy in the continuous case),
$$H(p) = -\int p(x) \log p(x)\, dx.$$
This measure has the advantage of a solid theoretical justification, and the related maximum entropy principle produces distributions that make intuitive sense as "less informative" given a set of constraints.
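As a sketch of how this plays out for the Gaussian case in the question, the differential entropy has a closed form, $H = \tfrac{1}{2}\log(2\pi e \sigma^2)$ in one dimension and $H = \tfrac{1}{2}\log\big((2\pi e)^k \det \mathbf{\Sigma}\big)$ for a $k$-variate Gaussian (the helper names below are mine, not from any particular library):

```python
import numpy as np

def gaussian_entropy(sigma2):
    """Differential entropy (in nats) of N(mu, sigma2); independent of mu."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma2)

def mvn_entropy(Sigma):
    """Differential entropy (in nats) of a k-variate Gaussian with covariance Sigma."""
    k = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(Sigma)  # numerically stable log-determinant
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

# Entropy is monotone in sigma^2, matching the intuition in the question.
assert gaussian_entropy(1.0) < gaussian_entropy(4.0)
```

Note that in the multivariate case the entropy depends on $\mathbf{\Sigma}$ only through $\det \mathbf{\Sigma}$, so the entropy ordering coincides with the determinant ordering among the scalar summaries mentioned above.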