Does the min-entropy $H_{\rm min}(X)\equiv \min_x\log(1/p_x)$ of a source $X$ have an operational interpretation?


This is a more specific version of this other related question of mine. Following, for example, the notation used in (Renner 2006), the min- and max-entropies of a source $X$ with probability distribution $P_X$ are defined as $$H_{\rm max}(X) \equiv \log|\{x : \,\, P_X(x)>0\}| = \log|\operatorname{supp}(P_X)|, \\ H_{\rm min}(X) \equiv \min_x \log\left(\frac{1}{P_X(x)}\right) = -\log \max_x P_X(x).$$
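For concreteness, here is a minimal Python sketch computing both quantities for an illustrative distribution (the pmf below is just a hypothetical example):

```python
import numpy as np

# Hypothetical example pmf P_X over four outcomes.
p = np.array([0.5, 0.25, 0.125, 0.125])

# H_max: log of the support size, i.e. the number of outcomes with P_X(x) > 0.
h_max = np.log2(np.count_nonzero(p))

# H_min: negative log of the largest probability.
h_min = -np.log2(p.max())

print(h_max)  # 2.0 bits
print(h_min)  # 1.0 bit
```

(For comparison, the Shannon entropy of this distribution is 1.75 bits, which lies between the two.)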

As mentioned in the comments of the linked post, $H_{\rm max}(X)$ can be interpreted as the optimal bound for compressibility in the single-shot regime.

Is there any similar kind of operational interpretation for the min-entropy $H_{\rm min}(X)$, be it in terms of single-shot compressibility or something else? I haven't found anything like this mentioned directly in the relevant literature, but I might have missed it.

1 Answer

Well, $P_{\max}=\max_{x} P(x)$ is the probability that an optimal guesser, given a single guess at the discrete random variable with pmf $(P(x))_{x \in A}$, succeeds. Hence $H_{\rm min}(X) = -\log P_{\max}$ is the self-information of the most likely outcome: the number of bits of information obtained in that scenario.
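To make the guessing interpretation concrete, here is a short Python sketch (using the same kind of hypothetical toy pmf as an example) that estimates the optimal guesser's success probability by sampling and compares it with $P_{\max}$ and $H_{\rm min}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.25, 0.125, 0.125])  # hypothetical example pmf

# The optimal single-guess strategy is to guess the most likely outcome.
guess = p.argmax()

# Estimate the guesser's success probability by sampling from P_X.
samples = rng.choice(len(p), size=100_000, p=p)
success_rate = np.mean(samples == guess)

print(success_rate)            # ≈ 0.5 = P_max
print(-np.log2(success_rate))  # ≈ 1.0 = H_min(X) in bits
```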

This is classical; see also the Wikipedia article on self-information:

Claude Shannon's definition of self-information was chosen to meet several axioms:

  • An event with probability 100% is perfectly unsurprising and yields no information.
  • The less probable an event is, the more surprising it is and the more information it yields.
  • If two independent events are measured separately, the total amount of information is the sum of the self-informations of the individual events.

It can be shown that there is a unique function of probability (up to the choice of logarithm base, i.e., up to a positive multiplicative constant) that meets these three axioms, namely $\log(1/ P(x)).$
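As a quick numerical sanity check of the additivity axiom (the probabilities below are arbitrary example values), the self-information of two independent events equals the sum of their individual self-informations:

```python
import math

def self_info(p: float) -> float:
    """Self-information log(1/p), in bits."""
    return math.log2(1.0 / p)

p_a, p_b = 0.5, 0.25  # arbitrary example probabilities of independent events

# For independent events the joint probability is the product, so the
# self-information of the joint event equals the sum of the parts.
print(self_info(p_a * p_b))             # 3.0 bits
print(self_info(p_a) + self_info(p_b))  # 3.0 bits
```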