Can we generate the ratio of two unknown probabilities?


Suppose we have two coins $A$ and $B$, where coin $A$ comes up heads with unknown probability $p_A$ ($0<p_A<1$) and coin $B$ comes up heads with unknown probability $p_B$ ($0<p_B<1$), and we would like to generate a sample that is $1$ with probability $\min(\frac{p_A}{p_B},1)$. Is there a general way of doing this? And if so, what is the expected number of coin tosses?

An idealized way of doing this would be some coupling: suppose that instead of coins with heads probabilities $p_A$ and $p_B$, we have oracles $O_A$ and $O_B$, where, given a real number $r\in [0,1]$ as input, $O_A$ answers whether $r<p_A$ or not (this oracle is much stronger, as we could now simply binary search to find $p_A$!). Then we can easily get a sample with probability $\min(\frac{p_A}{p_B},1)$ as follows:

Generate a uniform random real $r\in [0,1]$ and use $O_A$ and $O_B$ to check whether $r<p_A$ and whether $r<p_B$. If $r<p_A$, output $1$; if $p_A\leq r<p_B$, output $0$; if both $r\geq p_A$ and $r\geq p_B$, repeat. When it terminates, this procedure clearly outputs $1$ with probability $\min(\frac{p_A}{p_B},1)$, and the expected number of rounds (each using one query to each oracle) is $\frac{1}{\max(p_A,p_B)}$.
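The oracle procedure above can be sketched in Python. This is only an illustration: the hidden values `p_a` and `p_b` are example numbers chosen here, and the procedure itself sees them only through the threshold oracles.

```python
import random

def oracle_ratio_sample(oracle_a, oracle_b):
    """Return 1 with probability min(p_A / p_B, 1), using only
    threshold oracles of the form r -> (r < p)."""
    while True:
        r = random.random()          # uniform r in [0, 1)
        if oracle_a(r):              # r < p_A: output 1
            return 1
        if oracle_b(r):              # p_A <= r < p_B: output 0
            return 0
        # r >= p_A and r >= p_B: reject this round and retry

# Illustrative hidden biases (assumed values, not part of the algorithm):
p_a, p_b = 0.3, 0.6
samples = [oracle_ratio_sample(lambda r: r < p_a, lambda r: r < p_b)
           for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to min(0.3/0.6, 1) = 0.5
```

With $p_A = 0.3$ and $p_B = 0.6$, the empirical frequency should concentrate near $0.5$, matching $\min(p_A/p_B, 1)$.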

However, that relies on a much stronger oracle. I wonder whether one can do the same, or simulate the oracle process, in the coin model. Could anyone give a reference for this problem?

As pointed out in the comments, it may not be possible to generate a sample with probability $\min(\frac{p_A}{p_B},1)$. Is it then possible to generate a sample with probability $\frac{p_A}{p_A+p_B}$?

Accepted answer:

Your question is related to a body of literature known as the "Bernoulli factory" problem: given a coin with unknown probability of heads $\lambda$, sample an event with probability $f(\lambda)$. One important result is that $f:\mathcal{D}\subseteq(0,1) \to [0,1]$ can be sampled this way if and only if $f\equiv 0$, $f\equiv 1$, or $f$ is continuous and polynomially bounded on $\mathcal{D}$ (both $f$ and $1-f$ are bounded below by $\min(\lambda^n, (1-\lambda)^n)$ for some $n$). For example, it's impossible to simulate probability $2p_A$ with coin $A$ (that is, to double the chance of heads) when $p_A$ can be any value in $(0, 1/2)$ (Keane and O'Brien 1994).

It follows that, without more information on $p_A$ and $p_B$, it's likewise impossible to sample the probability $\frac{p_A}{p_B}$.

On the other hand, there is an algorithm to sample the probability $\frac{p_A}{p_A+p_B}$, as follows (Gonçalves et al. 2017):

  1. Flip a fair coin. (If one is unavailable, and assuming coin $A$ is not one-sided, flip coin $A$ twice; if the two results differ, take the first result as the fair-coin flip, otherwise repeat [von Neumann 1951].)
  2. If the fair coin shows heads, flip coin $A$ and return heads if that coin shows heads.
  3. If the fair coin shows tails, flip coin $B$ and return tails if that coin shows heads.
  4. Go to step 1.

We can see that the expected total number of flips of $A$ and $B$, not counting fair-coin flips, is $1/(\frac{1}{2} p_A + \frac{1}{2} p_B) = \frac{2}{p_A + p_B}$.
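The four steps above can be sketched in Python, including the von Neumann trick for extracting a fair bit. The biases `p_a` and `p_b` below are illustrative assumed values; the algorithm itself only ever flips the coins.

```python
import random

def fair_flip(coin):
    """von Neumann trick: extract a fair bit from a biased coin
    (assumed not one-sided)."""
    while True:
        a, b = coin(), coin()
        if a != b:
            return a    # first flip of a differing pair is unbiased

def sample_ratio(coin_a, coin_b):
    """Return True with probability p_A / (p_A + p_B),
    using only flips of the two coins."""
    while True:
        if fair_flip(coin_a):   # step 1: fair coin shows heads
            if coin_a():        # step 2: flip A; heads -> return heads
                return True
        else:                   # step 1: fair coin shows tails
            if coin_b():        # step 3: flip B; heads -> return tails
                return False
        # step 4: neither returned; go back to step 1

# Illustrative hidden biases (assumed values, not visible to the algorithm):
p_a, p_b = 0.2, 0.6
coin_a = lambda: random.random() < p_a
coin_b = lambda: random.random() < p_b
n = 100_000
est = sum(sample_ratio(coin_a, coin_b) for _ in range(n)) / n
print(est)  # close to 0.2 / (0.2 + 0.6) = 0.25
```

Each round returns "heads" with probability $\frac{1}{2}p_A$ and "tails" with probability $\frac{1}{2}p_B$, so conditional on returning, the probability of heads is $\frac{p_A}{p_A+p_B}$.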

There are numerous other Bernoulli factory algorithms like this one, covering many other functions.

REFERENCES:

  • Gonçalves, F. B., Łatuszyński, K. G., and Roberts, G. O., "Exact Monte Carlo likelihood-based inference for jump-diffusion processes", 2017.
  • Keane, M. S., and O'Brien, G. L., "A Bernoulli factory", ACM Transactions on Modeling and Computer Simulation 4(2), 1994.
  • von Neumann, J., "Various techniques used in connection with random digits", 1951.