What is the most efficient way to simulate a biased coin by tossing another biased coin?


There is a biased coin which independently comes up heads with probability $p\neq \dfrac{1}{2}$ on each toss.

I want to toss this coin multiple times to simulate another biased coin whose heads probability is $q\neq \dfrac{1}{2}$.

I am looking for an approach that minimizes the expected number of tosses.

BEST ANSWER

Here is one way to do it:

First we describe how to generate a biased coin flip with parameter $q$, whose binary expansion is $q = 0.q_1 q_2 \ldots$, from a fair coin. Generate a sequence of fair coin flips $X_1, X_2, \ldots$ and consider the number $C = 0.X_1 X_2 \ldots$, which is Uniform$([0,1))$. Declare the biased flip to be $1$ if $C < q$ and $0$ otherwise.

To decide whether $C < q$ without generating infinitely many bits, compare the flips to the digits of $q$ one at a time: at the first index $i$ with $X_i \neq q_i$, output $1$ if $X_i < q_i$ and $0$ if $X_i > q_i$.

On average this uses $2$ fair flips, since the index of the first flip to differ from the corresponding digit of $q$ follows a Geometric$(\frac{1}{2})$ distribution, which has mean $2$.

[This is problem 5.35 in Cover and Thomas, Elements of Information Theory 2e]
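The digit-comparison scheme above can be sketched in Python as follows (function names are my own; `random.getrandbits` stands in for the fair coin, and `Fraction` keeps the digits of $q$ exact):

```python
from fractions import Fraction
import random

def biased_bit_from_fair(q, fair_bit=lambda: random.getrandbits(1)):
    """Bernoulli(q) from fair bits: compare fair flips X_1, X_2, ... with
    the binary digits q_1, q_2, ... of q = 0.q_1 q_2 ..., stopping at the
    first index where they differ."""
    q = Fraction(q)  # exact arithmetic, so the digits of q never drift
    while True:
        q *= 2
        q_i, q = (1, q - 1) if q >= 1 else (0, q)  # next binary digit of q
        x = fair_bit()
        if x != q_i:
            # X_i < q_i (i.e. 0 vs 1) means C < q, so output 1; else 0.
            return 1 if x < q_i else 0
```

Each call consumes a Geometric$(\frac12)$ number of fair bits, i.e. $2$ on average.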


Now, we describe how to generate fair coin flips from a biased coin.

Take two flips of a biased coin with parameter $p$. If they are both the same, discard them and flip another pair. If the pair is $10$, output $1$; if it is $01$, output $0$. A pair differs with probability $2p(1-p)$, so the number of pairs needed follows a Geometric$(2p(1-p))$ distribution, and since each pair costs 2 flips, one fair bit costs on average $2 \cdot \frac{1}{2p(1-p)} = \frac{1}{p(1-p)}$ flips of the biased coin.

[This is from "Various techniques used in connection with random digits", by von Neumann (1951)]


Thus, first generating fair bits from the biased coin, and then the target biased flips from those fair bits, we need on average $\frac{1}{p(1-p)} \cdot 2 = \frac{2}{p(1-p)}$ flips of the biased coin with parameter $p$ per simulated $q$-flip.
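Chaining the two steps can be sketched as follows (a minimal illustration, names my own; `random.random` stands in for the physical $p$-coin):

```python
import random

def flip_p(p, rng=random.random):
    """One throw of the p-coin (simulated here with a PRNG)."""
    return int(rng() < p)

def fair_bit(p):
    """Von Neumann extraction: flip pairs of the p-coin until the two
    flips differ; the patterns 10 and 01 are equally likely."""
    while True:
        a, b = flip_p(p), flip_p(p)
        if a != b:
            return a  # 1 for the pair 10, 0 for the pair 01

def simulate_q(p, q):
    """Bernoulli(q) from the p-coin: feed von Neumann fair bits into
    the binary-digit comparison scheme described above."""
    while True:
        q *= 2
        q_i, q = (1, q - 1) if q >= 1 else (0, q)  # next binary digit of q
        if fair_bit(p) != q_i:
            # X_i=0, q_i=1 means C < q (output 1); X_i=1, q_i=0 means C > q.
            return q_i
```

On average each call uses $2$ fair bits at $\frac{1}{p(1-p)}$ biased flips apiece, matching the $\frac{2}{p(1-p)}$ count above.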

Note that for certain values of $p$ you can do better on average with a different extraction method (see the paper by Yuval Peres, "Iterating Von Neumann's Procedure for Extracting Random Bits").

Also, if $p$ and $q$ have some particular relationship, you may be able to optimize further without going through the biased→fair→biased chain.

ANOTHER ANSWER

Without specific values for $p, q$ the answer has to be general. You divide the probability space in such a way that you converge on the proper value. Suppose $p \lt q$. If you throw the coin once and get heads, declare heads for the simulated coin. If you get tails, you have used up $p$ of the $q$ chance for heads, so you now want to call heads with probability $\frac{q-p}{1-p}$ and tails with probability $\frac{1-q}{1-p}$. Throw the coin again: one of the two results will determine the outcome; otherwise, work out what odds you need on the third throw. Keep going until the outcome is determined.
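This probability-splitting scheme can be sketched as follows (a sketch under the assumptions above; names my own, `random.random` stands in for the physical coin). At each step, if the coin's heads probability $p$ fits inside the remaining target $t$, heads settles the outcome; otherwise tails does, and in either unresolved case $t$ is renormalized:

```python
import random

def simulate_flip(p, q, rng=random.random):
    """One flip of a simulated q-coin using throws of a p-coin,
    by repeatedly splitting the remaining probability mass."""
    t = q  # current target probability of heads
    while True:
        heads = rng() < p  # one throw of the p-coin
        if p <= t:
            if heads:
                return 1  # heads region lies entirely inside the target
            t = (t - p) / (1 - p)  # renormalize what remains of the target
        else:
            if not heads:
                return 0  # tails region lies entirely outside the target
            t = t / p  # renormalize: target sits inside the heads region
```

Each iteration resolves with probability at least $\min(p, 1-p)$, so the procedure terminates with probability 1, and one can check the invariant that the conditional heads probability stays equal to $t$.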