An RNG is reported to have an entropy of $2.521$ bits over $6$ outcomes. This means it is biased relative to a uniform random system, which would have the maximum entropy of $\log_2 6 \approx 2.585$ bits.
You want to create a betting game where you test two samples to see if they match. Say the standard bet is $1$ credit. With fair dice the odds of a match would be $1$ in $6$, so the payoff in a totally fair game should be $6$ credits. Due to the bias, however, a payoff of $6$ is no longer fair, because matches will occur at a higher rate than the theoretical $1$ in $6$. What should the payoff be?
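As a quick sanity check on the payoff logic, the fairness condition is just "expected profit is zero," i.e. $p_{\text{match}} \cdot \text{payoff} - 1 = 0$. A minimal sketch (the function name is my own, not from any library):

```python
# Fair payoff for a 1-credit bet: expected profit must be zero,
# i.e. p_match * payoff - 1 = 0, hence payoff = 1 / p_match.
def fair_payoff(p_match: float) -> float:
    return 1.0 / p_match

# Uniform 6-outcome case: p_match = 1/6 gives a fair payoff of 6 credits.
print(fair_payoff(1 / 6))
```

So the whole question reduces to estimating $p_{\text{match}}$ for the biased RNG.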
In trying to solve this myself, I started from the measured entropy of $2.521$. Knowing that a uniform distribution over $X$ outcomes has entropy $\log_2(X)$, I set $\log_2(X) = 2.521$ and solved for $X$, giving $X = 2^{2.521} \approx 5.739798$.
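That inversion ($2^H$ is sometimes called the perplexity, the "effective number of equally likely sides") is easy to reproduce numerically:

```python
H = 2.521        # reported Shannon entropy in bits
X = 2 ** H       # effective number of equally likely outcomes, ~5.7398
print(X)
print(1 / X)     # implied match probability under this reasoning, ~0.17422
```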
So the odds must be closer to $\frac{1}{5.739798}$ than to $\frac{1}{6}$. However, when I simulate this, the estimate is closer but still off: the predicted match probability of $17.422\%$ is below what the simulation produced, which was $18.364\%$. Is there a better way of determining an estimate without sampling the random stream of values from the RNG? Is there a better way of turning the entropy into the odds? I notice that $\frac{2}{2^x} - \frac{1}{6}$ comes out closer, but I don't know why that works better.
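To make the gap concrete: for two independent draws from a distribution $p_1,\dots,p_6$, the exact match probability is $\sum_i p_i^2$, and $2^{-H}$ (with $H$ the Shannon entropy) generally underestimates it, which is the direction of the discrepancy above. The sketch below uses a hypothetical biased distribution, an assumption for illustration only, since the question reports only the RNG's entropy, not its per-outcome probabilities:

```python
import math
import random

# Hypothetical biased 6-outcome distribution -- an assumption for
# illustration, NOT the actual RNG's (only its entropy was reported).
p = [0.30, 0.20, 0.15, 0.15, 0.10, 0.10]

H = -sum(q * math.log2(q) for q in p)    # Shannon entropy in bits
guess = 2 ** -H                          # the 1/2^H estimate from the question
collision = sum(q * q for q in p)        # exact P(two i.i.d. draws match)

# Monte Carlo check of the exact value.
rng = random.Random(0)
n = 200_000
hits = sum(a == b for a, b in zip(rng.choices(range(6), p, k=n),
                                  rng.choices(range(6), p, k=n)))
print(f"H = {H:.4f} bits, 2^-H = {guess:.5f}, "
      f"sum p_i^2 = {collision:.5f}, simulated = {hits / n:.5f}")
```

For what it's worth, $\sum_i p_i^2 = 2^{-H_2}$ where $H_2$ is the Rényi collision entropy, and $H_2 \le H$ always holds, so $2^{-H}$ is systematically a lower bound on the match probability; the Shannon entropy alone does not pin the match probability down.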
