Distributions with equal Rényi entropies


Suppose we have two distributions given by the vectors $p=(p_1,\dots,p_n)$ and $q=(q_1,\dots,q_n)$, with $p_i,q_i\geq 0$, and $\sum_i p_i = \sum_i q_i=1$.

Now suppose that for some $\alpha\in(0,\infty)$,

$$H_{\alpha}(p)=H_{\alpha}(q),$$

where $H_{\alpha}(p)=\frac{1}{1-\alpha}\log\sum_i p_i^{\alpha}$ is the Rényi entropy of $p$. What can we say about $p$ and $q$? Of course if $q$ is a permutation of $p$, then their entropies will be equal. But is the converse true?


As pointed out in the comments, all you can say is that the $\alpha$-norms of the two vectors are equal: $$ \sum_{i=1}^n p_i^{\alpha}=\sum_{i=1}^n q_i^{\alpha} $$ if and only if $$ \lVert p\rVert_{\alpha}=\lVert q\rVert_{\alpha}. $$ Of course their $1$-norms both equal one (i.e., $\sum_{i=1}^n p_i=\sum_{i=1}^n q_i=1$) and their entries are nonnegative. So both vectors lie on the standard simplex, i.e., the part of the $1$-norm unit sphere contained in the nonnegative orthant of $\mathbb{R}^n$. In particular, for a fixed $\alpha$ the converse fails: a single equation $\lVert p\rVert_{\alpha}=\lVert q\rVert_{\alpha}$ cuts out a whole level set on the simplex, which for $n\geq 3$ contains many distributions that are not permutations of one another.
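A quick numerical sketch of the failure of the converse for $\alpha=2$ and $n=3$ (the specific vectors $p$ and $q$ below are my own illustrative choice, not from the thread): fix $p$, then pick $q=(q_1,a,s-a)$ with $s=1-q_1$ and solve the quadratic $a^2+(s-a)^2 = \sum_i p_i^2 - q_1^2$ so that the squared norms match.

```python
import math

def renyi(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), for alpha != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

alpha = 2.0
p = [0.5, 0.3, 0.2]
target = sum(x ** alpha for x in p)  # sum of squares of p, here 0.38

# Build q = (q1, a, s - a) with the same sum of squares as p.
# q1 is a free choice (any value for which the quadratic below has
# a real root in [0, s] works).
q1 = 0.48
s = 1 - q1                 # remaining mass to split between the last two entries
c = target - q1 ** 2       # required value of a^2 + (s - a)^2
# a^2 + (s - a)^2 = c  <=>  2 a^2 - 2 s a + (s^2 - c) = 0
disc = (2 * s) ** 2 - 8 * (s ** 2 - c)
a = (2 * s + math.sqrt(disc)) / 4
q = [q1, a, s - a]

# q is a distribution, has the same Rényi-2 entropy as p,
# yet is not a permutation of p.
print(q, renyi(p, alpha), renyi(q, alpha))
```

Of course this only matches the entropies at the single chosen $\alpha$; requiring $H_{\alpha}(p)=H_{\alpha}(q)$ for all $\alpha$ is a much stronger condition.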