Calculating the information entropy of a discrete signal on a calculator


In the last few days I've found myself having to calculate the entropy of a discrete symbol source (one which emits a symbol chosen from a finite set with corresponding probabilities). The formula is straightforward but relatively lengthy to type into a calculator: $$ \sum_{n \in S} -p_n \log_{2}(p_n) $$ where $S$ is the set of all the symbols and $p_n$ is the probability of occurrence of symbol $n \in S$.
I have a Casio fx-570es plus, a common non-programmable scientific calculator, which requires me to manually expand the sum like this (let's say that the probabilities of $S$ are $[0.5,0.3,0.2]$):
$$ -0.5 \log_2{0.5} -0.3 \log_2{0.3} -0.2 \log_2{0.2} $$ This approach scales badly as the number of symbols grows, and every extra term is another chance to mistype a number.
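For reference, here is a minimal Python sketch of the same computation, useful for checking a calculator result against a known value (the function name `entropy` is my own choice, not from any particular library):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Terms with p = 0 are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The example distribution from the question:
print(entropy([0.5, 0.3, 0.2]))  # ≈ 1.4855 bits
```

This mirrors the expanded sum above term by term, just without the risk of retyping each probability twice.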
Does my calculator possess any functionality that can express this formula in a more concise way?