I am interested in the following two questions related to the famous Bernoulli factory problems.
Bernoulli factory for certain functions: suppose we are given a coin with arbitrary (unknown) head probability $p\in[0,1]$. I am wondering whether there is an easy-to-implement algorithm for generating a $\min\{p, 0.5\}$ coin. Since $f(p) = \min\{p, 0.5\} \geq \min\{p, 1-p\}$, the famous paper by Keane and O'Brien guarantees that such an algorithm exists. Meanwhile, the function looks slightly simpler than the ones discussed by Nacu and Peres, so I am wondering whether a simple algorithm for simulating it is already known. Similarly, can we simulate $f(p) = \min\{p, a\}$ efficiently for fixed $a\in (0,1)$?
High dimensional Bernoulli factory: how can we simulate a coin with head probability $f(p_1, p_2)$ (for example $f(p_1, p_2) = \min\{p_1, p_2, 0.5\}$), or more generally $f(p_1, \cdots, p_n)$? The setting here is that we are able to simulate each $p_i$ coin but have no further knowledge of the value of $p_i$, consistent with the traditional Bernoulli factory setting. When $f$ is a single-variable function, the problem reduces to the ordinary Bernoulli factory problem.
Any reference or idea for the multivariate case will be much appreciated, thanks!
I reproduce my answers from Cross Validated:
On number 1:
Since the function min(λ, c) satisfies the conditions of the Keane–O'Brien theorem, there are sequences of polynomials that converge to the function from above and from below. This is discussed, for example, in Thomas and Blanchet (2012) and in Łatuszyński et al. (2009/2011). However, neither paper shows how to automate the task of finding the polynomials the method requires, so finding such a sequence for an arbitrary function satisfying the Keane–O'Brien theorem remains far from trivial (and that question interests me to some extent, too).
Fortunately, there is an alternative way to simulate min(λ, 1/2) without having to build a sequence of polynomials explicitly. The algorithm I found is given below, and I have a page describing its derivation.
I suspected that the required degree d would be floor(m*2/3)+1. With help from the MathOverflow community, steps 3 and 4 of the algorithm can be described more efficiently as follows:
In addition, there is an approximate way to sample min(λ, c) and most other continuous functions f that map (0, 1) to (0, 1). Specifically, it's trivial to simulate an individual polynomial with Bernstein coefficients in [0, 1], even if the polynomial has high degree and follows the desired function closely (Goyal and Sigman 2012):
To use this algorithm, simply calculate a[j] = f(j/n), where n is the desired degree of the polynomial (such as 100). Each a[j] is one of the Bernstein coefficients of a polynomial that closely approximates the function; the higher n is, the better the approximation.
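The Goyal–Sigman approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' own code; the function names and the degree-100 default are my choices.

```python
import random

def simulate_bernstein(flip, a):
    """Simulate one toss of a coin whose head probability is the polynomial
    with Bernstein coefficients a[0..n], each in [0, 1] (Goyal and Sigman
    2012): flip the p-coin n times, count the heads j, then return heads
    with probability a[j]."""
    n = len(a) - 1
    j = sum(flip() for _ in range(n))  # number of heads in n flips
    return 1 if random.random() < a[j] else 0

def approx_factory(flip, f, n=100):
    """Approximately sample f(p) by simulating the degree-n polynomial
    whose Bernstein coefficients are a[j] = f(j/n)."""
    a = [f(j / n) for j in range(n + 1)]
    return simulate_bernstein(flip, a)
```

For instance, `approx_factory(flip, lambda x: min(x, 0.5))` approximately samples min(p, 1/2); the approximation error shrinks as the degree `n` grows, though the algorithm is never exact at finite degree.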
On number 2:
A related problem is the Dice Enterprise problem, first given by Morina et al. (2019/2020), which involves simulating an m-faced die using an n-faced die, where each face has its own probability of occurring. This is not the same as the problem in your question, though, since those probabilities are interrelated rather than independent.
EDIT (Nov. 13):
A very relevant paper was just made available and came to my attention:
Niazadeh, R., Leme, R.P., Schneider, J., "Combinatorial Bernoulli Factories: Matchings, Flows, and Polytopes", arXiv:2011.03865v1 [cs.DS], Nov. 7, 2020.
(Edited Apr. 4):
Let $f:\mathcal{P}\to [0, 1]$ be a function.
In the traditional Bernoulli factory problem, we have a coin with unknown probability of heads $\lambda$ and we seek to sample the probability $f(\lambda)$. In this case, the domain $\mathcal{P}$ is either $[0, 1]$ or a subset thereof.
The paper cited above, however, studies Bernoulli factories when $f$ has a different domain: namely when $\mathcal{P}$ is a "polytope contained in the unit hypercube", that is, either $[0,1]^n$ or a subset thereof. Now the Bernoulli factory problem is as follows:
For example, given two coins, the vector is $x = (\lambda_1, \lambda_2)$, and one possible choice of $\mathcal{P}$ is $[0, 1]\times[0, 1]$. In that case the polytope has four vertices, namely $(0, 0)$, $(0, 1)$, $(1, 0)$, and $(1, 1)$. When this procedure is repeated enough times, the sampled vertices average to $x$.
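For the full hypercube $[0,1]^n$ this kind of factory is in fact trivial, which may make the idea concrete. The sketch below is my own illustration (the coins are simulated only so the example runs; in the real setting only the flips are observable):

```python
import random

def make_coin(p):
    """A coin with head probability p. In the Bernoulli factory setting
    p is unknown and only the flips can be observed."""
    return lambda: 1 if random.random() < p else 0

def sample_vertex(coins):
    """For the polytope [0,1]^n, flipping each coin once and outputting
    the resulting 0/1 vector yields a vertex of the hypercube whose
    expectation is exactly x = (lambda_1, ..., lambda_n)."""
    return tuple(c() for c in coins)
```

For polytopes other than the full hypercube, finding such a vertex-sampling scheme is exactly the nontrivial question the paper studies.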
The main question studied in this paper is whether and how a polytope of the kind just given admits a Bernoulli factory.
Unfortunately, this is a different problem from the Bernoulli factory problem you give in your question, where the goal is to sample the probability $f((\lambda_1, \lambda_2, ..., \lambda_n))$ given $n$ coins with unknown probabilities of heads $\lambda_1, ..., \lambda_n$, as the case may be. Moreover, the use of the term "Bernoulli factory" as studied in the paper may be misleading, because the output of the algorithm is not necessarily a Bernoulli random variable. (This is so even though the paper defines "one-bit Bernoulli factories" similarly to traditional Bernoulli factories.) As a result, the paper didn't study any conditions on $f$ that are necessary for a Bernoulli factory of the kind you give in your question, such as whether $f$ has to be continuous or bounded away from 0 and 1 on its domain.
Nacu and Peres found algorithms to solve the Bernoulli factory problem using approximation theory. Currently, polynomials and rational functions are the main kinds of multivariate functions I am aware of that have Bernoulli factory algorithms (Morina et al.). Many multivariate analytic functions are also covered by the composition rule (e.g., Proposition 14(ii) of Nacu and Peres).
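The composition rule is simple to state in code: if $g(\lambda)$ is simulable from a $\lambda$-coin and $f(\mu)$ from a $\mu$-coin, feed the $g$-factory's output coin into the $f$-factory to simulate $f(g(\lambda))$. A minimal sketch, with $g(\lambda)=\lambda/2$ and $f(\mu)=\mu^2$ chosen by me for illustration:

```python
import random

def half_factory(flip):
    """Simulates g(lambda) = lambda/2: heads only if the lambda-coin
    and an independent fair coin both come up heads."""
    return 1 if flip() == 1 and random.random() < 0.5 else 0

def square_factory(flip):
    """Simulates f(mu) = mu^2: heads only if two independent
    mu-flips are both heads."""
    return 1 if flip() == 1 and flip() == 1 else 0

def composed_factory(flip):
    """Composition rule: feeding the lambda/2-coin into the mu^2
    factory simulates f(g(lambda)) = (lambda/2)^2."""
    return square_factory(lambda: half_factory(flip))
```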
However, although the function $g = \min(\lambda_0, \lambda_1)$ is continuous, it is not everywhere differentiable, which presents an apparent difficulty.
But by the Stone--Weierstrass theorem, any continuous function on $[0, 1]^d$ can be approximated arbitrarily well by polynomials, so I suspect the following:
This will require looking at the research on multivariate polynomial approximation (especially research that gives bounds on the approximation error with polynomials, especially multivariate polynomials in Bernstein form).
It's also possible that for the particular function $g$, a series expansion exists whose terms can be "tucked" under a discrete probability mass function, which enables an algorithm to simulate $g$ via convex combinations (Wästlund 1999, Theorem 2.7), as is the case for $\min(\lambda, 1/2)$, for example.
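The convex-combination rule just mentioned is itself easy to implement once the pieces exist. A hedged sketch (the function name is mine; each `factories[i]` stands for an already-available Bernoulli factory for $g_i$):

```python
import random

def convex_combination_factory(weights, factories):
    """Convex-combination rule (cf. Wastlund 1999, Theorem 2.7): if
    f = sum_i w_i * g_i, where the w_i form a probability mass function
    and each g_i has a Bernoulli factory, then f does too: pick index i
    with probability w_i, then run the factory for g_i once."""
    i = random.choices(range(len(weights)), weights=weights)[0]
    return factories[i]()
```

The hard part, of course, is finding a series expansion of the target function whose terms can be tucked under such a probability mass function in the first place.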
EDIT (Sep. 28, 2021): Also, see chapter 3 of G. Morina's doctoral dissertation (2021), which shows that multivariate Bernoulli factories require the function to be continuous and polynomially bounded.
EDIT (Feb. 18, 2022): A new paper by Leme and Schneider (2022), "Multiparameter Bernoulli Factories", deals with the problem you're asking about. Among other things, they show that a function $f(p_1, ..., p_n)$ admits a Bernoulli factory if and only if $f$ is continuous and meets a polynomial boundedness condition that reduces in the 1-dimensional case to that found in Keane and O'Brien.