How to measure the "entropy" of a binary array over time


Suppose the following arrays: $a$ = [0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 1] and $b$ = [1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0]. The proportion of 0's in $a$ is 5/12, while in $b$ it is 6/12. Although the probabilities are similar, the 0's in $b$ are more evenly spread out than in $a$, where they cluster toward the beginning. If my understanding of entropy is correct, $b$ therefore has higher entropy than $a$. I'm looking for a measure that captures this information over time, i.e., how the entropy of $a$ and $b$ evolves.
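The proportions above can be checked with a few lines of Python (variable names are mine, just for illustration):

```python
a = [0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 1]
b = [1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0]

p_a = a.count(0) / len(a)  # 5/12 ≈ 0.417
p_b = b.count(0) / len(b)  # 6/12 = 0.5
```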

My naive first intuition leads me to a probability measure computed over windows whose size is greater than the number of distinct symbols; for a binary array, the window size must be > 2. Let's see an example using non-overlapping windows of size 4:

Thus for $a$ we have: $a$ = [[0, 1, 0, 1], [0, 0, 1, 0], [1, 1, 1, 1]]

For each window, I take the complement of the dominant symbol's relative frequency, $1 - \max(\text{count})/\text{total}$, which lies between 0 and 1. This gives $o_a$ = [0.5, 0.25, 0]. For example, in the window [0, 0, 1, 0] the element 0 occurs 3/4 = 0.75 of the time, so subtracting from 1 gives 0.25.

Similarly for $b$: $b$ = [[1, 1, 0, 1], [0, 0, 1, 1], [0, 0, 1, 0]]

Then $o_b$ = [0.25, 0.5, 0.25].
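The whole procedure above can be sketched as follows (a minimal implementation of my measure; the function name `window_imbalance` is just a placeholder):

```python
from collections import Counter

def window_imbalance(xs, w=4):
    """For each non-overlapping window of size w, return
    1 - (count of the most frequent symbol) / w."""
    out = []
    for i in range(0, len(xs) - w + 1, w):
        window = xs[i:i + w]
        top_count = Counter(window).most_common(1)[0][1]
        out.append(1 - top_count / w)
    return out

a = [0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 1]
b = [1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0]

window_imbalance(a)  # [0.5, 0.25, 0.0]
window_imbalance(b)  # [0.25, 0.5, 0.25]
```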

Is there an established metric that performs this kind of computation more elegantly?