Entropy of a Mealy machine


I have a finite-state machine with binary input, and each state is mapped (non-injectively) to a letter of an output alphabet, here also binary.

Assuming the binary input is uniformly random, how much information is still in the output?

More precisely, if $X_n$ is the random variable that contains the output of the machine at step $n$, what is $\lim_{n\to\infty} \frac{H(X_0,\ldots,X_n)}{n}$?

Simple example:

States: A, B, C, D, E

Transitions (rows: current state; columns: input bit):

  | 0 | 1
A | B | C
B | B | C
C | A | E
D | A | E
E | A | D

Outputs:

A | 1
B | 0
C | 0
D | 1
E | 0

Example run (initial state B):

Input:   011011100100110
States:  BCEACEDABCABCEA
Output:  000100110010001
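The trace above can be reproduced with a few lines of Python. The transition and output tables are transcribed from the question; the helper name `run` is just for illustration:

```python
# Mealy machine from the question: the next state depends on (state, input bit),
# and each step emits the output letter attached to the state just entered.
TRANS = {'A': ('B', 'C'), 'B': ('B', 'C'), 'C': ('A', 'E'),
         'D': ('A', 'E'), 'E': ('A', 'D')}   # (successor on 0, successor on 1)
OUT = {'A': '1', 'B': '0', 'C': '0', 'D': '1', 'E': '0'}

def run(start, bits):
    """Feed the input bits to the machine; return visited states and output."""
    state, states, out = start, [], []
    for b in bits:
        state = TRANS[state][int(b)]  # take the transition labelled by the bit
        states.append(state)
        out.append(OUT[state])        # emit the letter of the new state
    return ''.join(states), ''.join(out)

states, out = run('B', '011011100100110')
print(states)  # BCEACEDABCABCEA
print(out)     # 000100110010001
```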

Answer:
In my concrete application I later learned that I am really interested in the collision entropy (Rényi entropy of order 2), not the usual Shannon entropy.

For that case, I solved the problem by converting the Mealy automaton into a Markov chain (see Section 2.2 of https://eprint.iacr.org/2018/1163 for details).
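For the example machine, the conversion amounts to a Markov chain on the five states in which each input bit is taken with probability 1/2. A minimal sketch (plain Python; no claim to match the paper's exact construction):

```python
# Example machine from the question.
TRANS = {'A': ('B', 'C'), 'B': ('B', 'C'), 'C': ('A', 'E'),
         'D': ('A', 'E'), 'E': ('A', 'D')}
STATES = sorted(TRANS)

# Markov transition probabilities under uniformly random input:
# P[s][t] = (number of input bits leading from s to t) / 2.
P = {s: {t: 0.0 for t in STATES} for s in STATES}
for s, succs in TRANS.items():
    for t in succs:
        P[s][t] += 0.5

# Stationary distribution by power iteration (the chain is irreducible
# and aperiodic, e.g. because of the self-loop B -> B).
pi = {s: 1.0 / len(STATES) for s in STATES}
for _ in range(1000):
    pi = {t: sum(pi[s] * P[s][t] for s in STATES) for t in STATES}

print({s: round(pi[s], 4) for s in STATES})
# ≈ {'A': 0.25, 'B': 0.25, 'C': 0.25, 'D': 0.0833, 'E': 0.1667}
```

For this machine the stationary distribution can also be checked by hand: the balance equations give π(A) = π(B) = π(C) = 1/4, π(D) = 1/12, π(E) = 1/6.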

For the resulting chain I could then calculate the Rényi entropy with the method published at http://dx.doi.org/10.1109/ISIT44484.2020.9174390 (also available at https://arxiv.org/abs/1709.09699).
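As a sanity check against such a computation, the collision entropy rate can also be approximated by brute force: enumerate all length-n input blocks (with the initial state drawn from the stationary distribution), collect the exact distribution of output blocks, and evaluate H_2(n)/n for growing n. This is a naive exponential-time sketch, not the algorithm of the cited paper; the stationary probabilities below are those of the example machine under uniform input:

```python
import math
from itertools import product

TRANS = {'A': ('B', 'C'), 'B': ('B', 'C'), 'C': ('A', 'E'),
         'D': ('A', 'E'), 'E': ('A', 'D')}
OUT = {'A': '1', 'B': '0', 'C': '0', 'D': '1', 'E': '0'}
# Stationary distribution of the induced Markov chain (uniform input bits).
PI = {'A': 1/4, 'B': 1/4, 'C': 1/4, 'D': 1/12, 'E': 1/6}

def output_block_dist(n):
    """Exact distribution of the first n output letters, start state ~ PI."""
    dist = {}
    for s0 in PI:
        for bits in product('01', repeat=n):
            s, out = s0, []
            for b in bits:
                s = TRANS[s][int(b)]
                out.append(OUT[s])
            key = ''.join(out)
            dist[key] = dist.get(key, 0.0) + PI[s0] / 2**n
    return dist

def collision_entropy(dist):
    """H_2 in bits: -log2 of the collision probability."""
    return -math.log2(sum(p * p for p in dist.values()))

# H_2(n)/n for growing n approximates the collision entropy rate.
for n in (2, 4, 6, 8):
    print(n, collision_entropy(output_block_dist(n)) / n)
```

Convergence in n can be slow, so this only gives a rough numerical cross-check of the exact method above.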