Entropy conservation of binary functions


In the merge function from https://github.com/ifdefelse/ProgPOW, the authors talk about entropy-maintaining functions that operate on bit-strings.

Can someone explain what this means? I have found literally nothing about it on the Internet so far.

Best Answer

As in my comments:

Entropy maintaining

I assume that "entropy maintaining" refers to a map $(X_1, ..., X_n)\rightarrow (Y_1, ..., Y_n)$ with the property that: $$H(X_1, ..., X_n) = H(Y_1, ..., Y_n)$$ A sufficient condition for this is that the map is a bijection: applying an invertible function to a random vector only permutes its probability mass function without changing the probability values, so the entropy is unchanged.
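As a concrete sketch, one can check exhaustively (on a small word size) that a ProgPoW-style merge operation is invertible in one argument, and hence entropy-preserving in the sense above. The specific formula `a * 33 + b` is an assumption here, chosen because multiply-accumulate with an odd multiplier is the kind of operation the ProgPoW repository describes; it is not claimed to be the exact function used.

```python
BITS = 8                  # small word size so exhaustive enumeration is cheap
MASK = (1 << BITS) - 1

def merge(a, b):
    # Hypothetical ProgPoW-style merge: multiply-accumulate with an odd
    # multiplier, truncated to the word size. Odd multipliers are units
    # mod 2^BITS, so b -> merge(a, b) is invertible for every fixed a.
    return (a * 33 + b) & MASK

# For every fixed a, b -> merge(a, b) hits all 2^BITS outputs exactly once,
# so (a, b) -> (a, merge(a, b)) is a bijection and preserves entropy.
for a in range(1 << BITS):
    outputs = {merge(a, b) for b in range(1 << BITS)}
    assert len(outputs) == (1 << BITS), "not a bijection in b"

print("merge is a bijection in its second argument")
```

A non-invertible combiner such as `a & b`, by contrast, collapses many inputs onto one output and destroys entropy, which is exactly what such a design wants to avoid.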

Definition of entropy $H(X_1, ..., X_n)$

Let $B$ be the set of all $n$-dimensional binary vectors. Define the probability mass function: $$p(x_1, ..., x_n) = P[(X_1,...,X_n)=(x_1,...,x_n)] \quad \forall (x_1,...,x_n)\in B$$ Then $$H(X_1, ..., X_n) = -\sum_{(x_1, ..., x_n) \in B} p(x_1, ..., x_n)\log(p(x_1, ..., x_n))$$

If the log is a natural log, the entropy is in units of nats; if it is a base-2 log, it is in units of bits. If $(X_1, ..., X_n)$ are i.i.d. Bernoulli-$(1/2)$ variables then $$H(X_1, ..., X_n) = n \quad \text{(bits)}$$
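The definition above can be computed directly for small $n$. The sketch below enumerates all $n$-bit vectors, assigns the uniform (i.i.d. Bernoulli-$(1/2)$) probability $2^{-n}$ to each, and evaluates the entropy sum in base-2, recovering $H = n$ bits:

```python
import math
from itertools import product

def entropy_bits(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

n = 4
# i.i.d. Bernoulli(1/2) bits: every n-bit vector in B has probability 2^-n.
uniform = {bits: 2.0 ** -n for bits in product((0, 1), repeat=n)}
print(entropy_bits(uniform))   # 4.0, i.e. H = n bits
```

Any pmf that is merely a permutation of `uniform` (as produced by a bijective map) gives the same value, which is the "entropy maintaining" property in miniature.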

See also: https://en.wikipedia.org/wiki/Entropy_(information_theory)