I have an exercise in Randomized Algorithms. It takes me time to answer, and this kind of question is new for me, so I need a hint to begin with.
Problem 1.3 [Motwani and Raghavan's textbook] (Due to D.E. Knuth and A. C-C. Yao) Suppose you are provided with a source of unbiased random bits. Provide efficient (in terms of expected running time and expected number of random bits used) schemes for generating samples from the distribution over the set {2,3,4, ...,12} induced by rolling two unbiased dice and taking the sum of their outcomes.
In simple words, we need to use a fair coin to simulate the sum of two fair dice; that is, we need to design an algorithm that does this job.
The problem I have is that there are 36 equally likely outcomes when rolling two unbiased dice, but $n$ flips of a coin give $2^n$ possible values — 32 or 64, never exactly 36. So there must be a way to restart in certain cases. Another problem is that a sum of 2 happens only once out of 36. Suppose we use 6 bits; then $36x=64$ gives $x\approx1.78$. How do I make this 1.78 discrete — do I round it up or down? But if I round either way, then the simulation between 36 and 64 is no longer exact.
I hope someone could answer this question or give me hint.
Thank you!
Split the interval $[0,1[\>$ into half-open subintervals of length ${1\over36}$. Allocate the first of these subintervals to the sum (dice outcome) $2$, the next two to the sum $3$, the next three to the sum $4$, and so on, until the last subinterval, which is allocated to the sum $12$. In this way we have obtained a partition of $[0,1[\>$ into eleven fractions whose lengths correspond to the probabilities of the various sums.
Now draw bits $b_k\in\{0,1\}$ from your random generator and observe the sums $$s_n:=\sum_{k=1}^n b_k 2^{-k}\qquad(n\geq1)\ .$$
Stop as soon as $s_n+2^{-n}$ lies in the same subinterval as $s_n$, and output the sum allocated to that subinterval.
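A minimal Python sketch of this procedure, using exact integer arithmetic for the interval comparisons so no floating-point issues arise (the function name `sample_dice_sum` and the bit-source parameter are my own choices, not part of the answer):

```python
def sample_dice_sum(rand_bit):
    """Draw one sample of the two-dice sum from a stream of fair bits.

    rand_bit: a zero-argument function returning 0 or 1 uniformly.
    """
    # Cumulative counts (out of 36) delimiting the eleven subintervals:
    # the sum s = 2..12 occupies [cum[s-2]/36, cum[s-1]/36).
    cum = [0, 1, 3, 6, 10, 15, 21, 26, 30, 33, 35, 36]
    lo, denom = 0, 1              # current interval is [lo/denom, (lo+1)/denom)
    while True:
        lo = 2 * lo + rand_bit()  # append one bit: s_n = lo / denom
        denom *= 2
        # Stop when [s_n, s_n + 2^{-n}) fits inside a single subinterval;
        # the containment test is cross-multiplied to stay in integers.
        for i in range(11):
            if cum[i] * denom <= lo * 36 and (lo + 1) * 36 <= cum[i + 1] * denom:
                return i + 2
```

For example, `sample_dice_sum(lambda: random.getrandbits(1))` draws one sample using Python's built-in generator as the bit source.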
The random variable $S$ representing the sum of two independent dice throws has entropy $$H=\sum_{k=2}^{12} p_k \log_2\left({1\over p_k}\right)=3.2744\quad {\rm [bits]}\ .$$ This means that in the "Shannon limit" $3.2744$ throws of a coin would be sufficient to simulate one sample of $S$. I made $3.6\cdot 10^6$ trials with the proposed algorithm, and came to $5.062$ drawings of a random bit per sample. This is better than the $6$ bits required in other proposed setups.
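As a sanity check on the entropy figure, $H$ can be computed numerically from the eleven probabilities $p_k = (\text{ways to roll } k)/36$ (a short sketch; the variable names are mine):

```python
import math

# Number of ways to roll each sum k = 2..12 with two dice.
ways = [1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]

# H = sum over k of p_k * log2(1 / p_k), with p_k = ways / 36.
H = sum((w / 36) * math.log2(36 / w) for w in ways)
print(round(H, 4))  # → 3.2744
```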