How to express less information than a uniform probability distribution function?


Suppose you have a discrete finite set of possible outcomes, say $x_i,\,i=1\ldots n$. And "information" here means what you can say about which outcome will occur. That would typically be represented by some probability distribution function, say $\rho(i)$, with the usual constraints. And then the least information (maximum entropy) you could have would be represented by the uniform distribution $\forall i: \rho(i)=\frac1n$.
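To make the "least information = uniform" claim concrete, here is a minimal sketch (standard Shannon entropy; the function name and example distributions are mine, not from the question) showing that the uniform distribution attains the maximum entropy $\log_2 n$:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), with 0*log(0) := 0."""
    return -sum(q * math.log2(q) for q in p if q > 0)

n = 4
uniform = [1 / n] * n          # rho(i) = 1/n for all i
skewed = [0.7, 0.1, 0.1, 0.1]  # some other distribution on the same outcomes

print(entropy(uniform))  # log2(4) = 2.0 bits, the maximum for n = 4
print(entropy(skewed))   # strictly less than 2 bits
```

Any non-uniform distribution on the same $n$ outcomes has strictly smaller entropy, which is the sense in which the uniform distribution expresses the least information.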

But suppose you don't even know that, i.e., you don't even know the pdf characterizing the $x_i$ outcomes. You might suggest a pdf over the space of pdfs, but such a "second-order pdf" still reduces to a pdf. So how can you formally/rigorously express less information than a uniform distribution, where you don't even know the distribution to begin with? Because knowing the distribution is still knowing something. So, also, how would you represent "no information", what would it mathematically mean?


Best answer:

Make a partition of the universe: that is, form sets $X_i$ that are pairwise disjoint, with their union being the original set $Y$.

$$\bigcup_i X_i = Y$$

Now assign a probability to each set: $P(X_i) = |X_i|/|Y|$.

This is very close to the uniform distribution but carries less information, because we assign probabilities only to the sets, not to the individual elements.

In the extreme case, let $X_1$ and $X_2$ have equally many elements (or let $X_1$ have one more element than $X_2$), with $X_1 \cup X_2 = Y$:

$$P(X_1) \approx P(X_2) \approx \tfrac12$$
This carries very little information.

Finally, take the trivial partition, $P(Y)=1$, which carries no information at all!
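The progression above can be sketched numerically: as the partition coarsens from singletons to two halves to the whole set, the entropy of the induced distribution $P(X_i)=|X_i|/|Y|$ drops to zero. (A minimal sketch; the variable names and the choice of $|Y|=8$ are mine, not from the answer.)

```python
import math

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def partition_dist(parts, universe):
    """Induced distribution: P(X_i) = |X_i| / |Y| for each cell of the partition."""
    return [len(part) / len(universe) for part in parts]

Y = list(range(8))  # 8 equally likely outcomes

fine = [[y] for y in Y]   # singletons: recovers the uniform distribution itself
halves = [Y[:4], Y[4:]]   # two equal cells: P(X_1) = P(X_2) = 1/2
trivial = [Y]             # the whole set: P(Y) = 1

for parts in (fine, halves, trivial):
    print(len(parts), "cells:", entropy(partition_dist(parts, Y)), "bits")
# entropy drops: 3.0 -> 1.0 -> 0.0 bits
```

Each coarsening discards distinctions between outcomes, so the partition-level distribution expresses strictly less information, with the trivial partition $P(Y)=1$ expressing none.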