Information Theory and Basic Entropy


Say there are two fair, identical coins. Let heads be worth one point and tails be worth two points. We flip the two coins simultaneously.

Let's consider two experiments $X$ and $Y$ on the set $S=\{2, 3, 4\}$. In experiment $X$, $P_X(x)$ is the probability that the sum of the points on the two coins is $x$. In experiment $Y$, $P_Y(y)$ is the probability that the maximum number of points is $\frac{1}{2}y$ (if there is a tie, the maximum is the common value).

Determine, with proof, which of the entropies $H(X)$ and $H(Y)$ is greater.

I'm having a bit of trouble getting started on this problem. First off, I don't understand what is meant by "the probability that the maximum number of points is $\frac{1}{2}y$".

I am able to compute the entropy for the first part of the problem, and I would appreciate feedback on whether I did it correctly:
$H(X) = -(\frac{1}{3}log(\frac{1}{3})+\frac{1}{3}log(\frac{1}{3})+\frac{1}{3}log(\frac{1}{3}))$


Best answer:

Let the outcomes of the coins be denoted by $A$ and $B$, where $A,B \in \{1,2\}$. Note that $X=A+B$. Therefore, $$\Pr(X=2)=\Pr(A=1)\cdot\Pr(B=1)=1/4,$$ $$\Pr(X=3)=\Pr(A=1)\cdot\Pr(B=2) + \Pr(A=2)\cdot\Pr(B=1)=1/2,$$ and $$\Pr(X=4)=\Pr(A=2)\cdot\Pr(B=2)=1/4.$$
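Since the sample space has only four equally likely outcomes, you can verify the distribution of $X$ by brute-force enumeration. A minimal Python sketch (the variable names are my own):

```python
from itertools import product
from collections import Counter

# Enumerate the four equally likely outcomes (A, B) with A, B in {1, 2}
# and tally the sum X = A + B.
counts = Counter(a + b for a, b in product([1, 2], repeat=2))
probs = {x: c / 4 for x, c in counts.items()}
print(probs)  # {2: 0.25, 3: 0.5, 4: 0.25}
```

This confirms $P_X = (1/4,\, 1/2,\, 1/4)$, so the uniform $(1/3, 1/3, 1/3)$ in the attempted $H(X)$ above is not correct.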

For the experiment $Y$, we compute $P_Y(y)$ by noting that $Y = 2\max\{A,B\}$. Observe that $$\Pr(Y=2)=\Pr(A=1)\cdot\Pr(B=1)=1/4,$$ and $$\Pr(Y=4)=1-\Pr(Y=2)=3/4.$$ Therefore, $$H(Y)=\frac{1}{4}\log(4)+\frac{3}{4}\log(4/3).$$ Now compare $H(Y)$ with $H(X)$ to determine which one is greater.
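As a quick numeric check of the comparison, here is a short Python sketch (the helper name `entropy` is my own; logs are taken base 2, so entropies are in bits):

```python
import math

def entropy(ps):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

H_X = entropy([1/4, 1/2, 1/4])  # X = sum of the two coins
H_Y = entropy([1/4, 3/4])       # Y = twice the maximum
print(H_X, H_Y)  # H_X = 1.5 bits, H_Y is about 0.811 bits
```

So $H(X) = 1.5 > H(Y) \approx 0.811$: the sum is the more uncertain experiment, as its distribution is closer to uniform.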