Calculating the probability $~p~$ from the binary entropy using logarithms


A person rolls an unfair die. The result of each roll is reported both as even/odd and as small/medium/large ("small" means $\{1,2\}$, "medium" means $\{3,4\}$, "large" means $\{5,6\}$).

Let $~X~$ be the random variable with distribution $Pr~\{\text{even, odd}\}$ and $~Y~$ the random variable with distribution $Pr~\{S,M,L\}$.

It is known that the probability that this die shows a four equals the probability of an odd result, and further that the entropy of the random variable $~X~$ is $~H(X)=0.97095~$ bits. Find the probability of rolling a four ($Pr~\{X=\text{odd}\}$).

$$H(X)=0.97095= -~p~\log_2 p - (1-p)~\log_2(1-p)$$ I need to find $~p~$.

The answers are $~0.4~$ and $~0.6~$, but how can I find them when the logarithm is in base $~2~$? And how can I find $~p~$ with only this information, using logarithm properties?

$$0.97095=\log_2\left(\frac{1}{p^p}\cdot\frac{1}{(1-p)^{1-p}}\right)~?$$
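Pushing this manipulation one step further (exponentiating both sides with base $2$) gives the equivalent form

$$p^p\,(1-p)^{1-p} = 2^{-0.97095} \approx 0.510,$$

which is transcendental in $~p~$ and has no closed-form solution; it can only be solved numerically or graphically.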

On BEST ANSWER

The graph below plots the binary entropy function $H(p)$ against the probability $p$. Reading off the graph, $H(p) = 0.97095$ at $p = 0.4$ (and, by symmetry, at $1-p = 0.6$). Finding $p$ algebraically is difficult, because the equation is transcendental.
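Since the equation cannot be solved in closed form, a numerical root-finder is the practical alternative to reading the graph. A minimal sketch in Python (the function and tolerance names are my own), using bisection on $[0, 0.5]$, where $H(p)$ is strictly increasing:

```python
from math import log2

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits
    return -p * log2(p) - (1 - p) * log2(1 - p)

def solve_entropy(h, tol=1e-10):
    # Bisect on (0, 0.5], where binary_entropy is strictly increasing.
    lo, hi = 1e-12, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binary_entropy(mid) < h:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p = solve_entropy(0.97095)
print(round(p, 4))   # 0.4; by symmetry the other solution is 1 - p = 0.6
```

Any bracketing root-finder (e.g. `scipy.optimize.brentq`) works equally well; bisection is used here only to keep the sketch dependency-free.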

Good luck!

[Graph: binary entropy $H(p)$ versus $p$]