Sum of Information Entropy from Conditional Events


I've got a problem regarding entropy of information and I'm stuck on the last part. So, there's a theoretical random number generator that spits out random bits of a requested length. However, with a certain probability, it returns a string of 1's in an entirely predictable manner.

Let $A$ be the event that it works properly and $B$ the event that it fails. So:

$$P(A)=p$$ $$P(B)=1-p$$

It's trivial to determine the entropy in the case that it works, for a key of $n$ bits:

$$H(K|A)=n$$

Now, since the output is entirely predictable for the case the RNG fails, I also know that:

$$H(K|B)=0$$

Since the output is predictable, there is no surprise and therefore no entropy associated with the result. However, I don't know how to determine the overall entropy of the RNG. I know conceptually that it needs to be less than $n$, since the failure mode makes the output less than perfectly random. Using Bayes' Theorem I can get to the following.

$$H(K)=H(K|A)+H(K|B)-H(A,B)$$

However since $A$ and $B$ are mutually exclusive, that doesn't seem to work out. Am I missing a step or have I made a conceptual error with my approach to this? Thanks in advance


Best answer:

It's easy enough to just calculate directly.

Case $A$ occurs with probability $p$, and each bit string has probability $1/2^n$ here.

Case $B$ occurs with probability $1-p$, and the all-ones bit string has probability $1$.

So what is the probability of each bitstring?

A string that is not all ones has probability $p(x_i) = p / 2^n$ where $x_i \not = \text{ones}$. There are $2^n - 1$ of these.

The all ones string has probability $p(\text{ones}) = p / 2^n + 1 - p$.

\begin{align*} H &= - \sum_x p(x) \log p(x) \\ &= - \left[\sum_{x \neq \text{ones}} p(x) \log p(x)\right] - p(\text{ones}) \log p(\text{ones}) \\ &= - \left[\sum_{x \neq \text{ones}} \frac{p}{2^n} \log \frac{p}{2^n}\right] - p(\text{ones}) \log p(\text{ones}) \\ &= - (2^n - 1) \frac{p}{2^n} \log \frac{p}{2^n} - \left(\frac{p}{2^n} + 1 - p\right) \log \left(\frac{p}{2^n} + 1 - p\right) \end{align*}
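As a sanity check, the closed form above can be compared against a brute-force sum over all $2^n$ strings for a small $n$ (a sketch in Python; the function names and the sample values $n=4$, $p=0.9$ are my own choices):

```python
import math

def rng_entropy_closed_form(n, p):
    """Entropy (in bits) from the closed-form expression derived above.

    Assumes 0 < p < 1; n is the number of bits per output string.
    """
    q = p / 2**n                 # probability of any single string when the RNG works
    p_ones = q + (1 - p)         # the all-ones string also appears on every failure
    return -(2**n - 1) * q * math.log2(q) - p_ones * math.log2(p_ones)

def rng_entropy_brute_force(n, p):
    """Direct evaluation of -sum p(x) log2 p(x) over all 2**n strings."""
    h = 0.0
    all_ones = 2**n - 1
    for x in range(2**n):
        px = p / 2**n + ((1 - p) if x == all_ones else 0.0)
        h -= px * math.log2(px)
    return h

n, p = 4, 0.9
print(rng_entropy_closed_form(n, p))   # slightly below n, as expected
print(rng_entropy_brute_force(n, p))
```

The two agree, and the result sits strictly below $n$ bits for any $0 < p < 1$, matching the intuition in the question.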

Another answer:

In your description, $A$ and $B$ are events; you may want to work with the indicators of $A$ and $B$ instead, which makes them random variables, with $\mathbf 1_A = 1 - \mathbf 1_B$. This means that $H(\mathbf 1_A, \mathbf 1_B) = H(\mathbf 1_A) = h_2(p)$. Note that I don't really know what $H(K|A)$ and $H(K|B)$ are or how you got them.
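For reference, $h_2$ here is the binary entropy function, $h_2(p) = -p \log_2 p - (1-p)\log_2(1-p)$; a minimal sketch:

```python
import math

def h2(p):
    """Binary entropy h2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.

    The endpoints p = 0 and p = 1 are handled by the convention 0*log(0) = 0.
    """
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h2(0.5))  # 1.0 — a fair coin carries exactly one bit
```

This is the entropy contributed by the success/failure indicator alone.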