$\min$-entropy for the uniform distribution on $[n]$


The min-entropy of a distribution $\nu$ on $[n]$ is given as: $$H_{\infty}(\nu)=\min_{i} \log(\frac{1}{\nu(i)})$$

We want to prove that for every distribution $\nu$ on $[n]$, with $U$ the uniform distribution on $[n]$, we have: $$H_{\infty}(\nu) \leq H_{\infty}(U)$$ My thought is that $U(i)=\frac{1}{n}$, so we get $\log(\frac{1}{(\frac{1}{n})_i})=(n)_i$, which has its minimum at the first element, so $H_{\infty}(U)=1$? But the lowest element must be $1$ because $i\geq 1$, therefore $H_{\infty}(\nu) \leq H_{\infty}(U)$? I'm not sure whether I have understood this correctly. Something tells me it is not correct, but I don't think I understand what the $i$ means. Can anyone help me?


I am guessing $[n] = \{1, \dots, n\}$ and that $\nu$ is discrete. First, note that $U(i) = \frac{1}{n}$ for every $i$, so each term $\log\frac{1}{U(i)}$ equals $\log n$ and therefore $H_\infty(U) = \log n$. (Here $i$ indexes the elements of $[n]$; the minimum in the definition runs over these elements, not over the values $\frac{1}{n}$ themselves.)

Now suppose $\nu$ is different from the uniform distribution. Then $\nu(i) > \frac{1}{n}$ for some $i$, as the opposite assumption would yield $\nu([n]) < 1$. (To see this, observe that if $\nu(i) \leq \frac{1}{n}$ for all $i$, and $\nu$ is not the uniform distribution, then $\nu(i_0) < \frac{1}{n}$ for some $i_0$, and then $\nu([n]) = \sum_i \nu(i) = \nu(i_0) + \sum_{i \neq i_0} \nu(i) < \frac{1}{n} + \frac{n-1}{n} = 1,$ contradicting the fact that $\nu$ is a probability mass function.) Hence $\max_i \nu(i) > \frac{1}{n}$, so $$H_\infty(\nu) = \log\frac{1}{\max_i \nu(i)} < \log n = H_\infty(U).$$ If $\nu = U$, the two sides are equal, so in all cases $H_\infty(\nu) \leq H_\infty(U)$.
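The inequality is easy to check numerically. Below is a minimal Python sketch using base-2 logarithms (the original post does not fix a base); the distribution `nu` is just an arbitrary non-uniform example I chose for illustration:

```python
import math

def min_entropy(p):
    # H_inf(p) = min_i log2(1/p(i)) = -log2(max_i p(i))
    return -math.log2(max(p))

n = 4
U = [1.0 / n] * n                 # uniform distribution on [n]
nu = [0.5, 0.25, 0.125, 0.125]   # an arbitrary non-uniform distribution on [n]

print(min_entropy(U))    # log2(4) = 2.0
print(min_entropy(nu))   # -log2(1/2) = 1.0, which is <= 2.0
```

The identity $\min_i \log\frac{1}{p(i)} = \log\frac{1}{\max_i p(i)}$ is what lets the code take a single `max` instead of minimizing over all $i$, since $\log\frac{1}{x}$ is decreasing in $x$.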