The title says it all. How can you take a sequence of fair coin flips $x_1,x_2,\ldots,x_n\sim\operatorname{Bernoulli}(1/2)$ and generate a sample from a biased coin $X \sim \operatorname{Bernoulli}(1/3)$ with $n$ as small as possible?
If this isn't possible exactly, how about a sample of $\tilde X \sim \operatorname{Bernoulli}(1/3 \pm\epsilon)$ for some small $\epsilon>0$?
I know of algorithms to sample $X$ using uniform random numbers, but I can't think of a procedure that uses only a finite number of coin flips.
There's no procedure using a fixed finite number $n$ of flips that works exactly: any event determined by $n$ fair coin flips is a union of outcomes each having probability $1/2^n$, so its probability is a rational number of the form $m/2^n$, and $1/3$ cannot be written in this form (since $3m = 2^n$ has no integer solution).
To get a good approximation with $n$ coins, let $m$ be the closest integer to $2^n/3$, view your sequence of flips as the binary expansion of an integer in $\{0,1,\dots,2^n-1\}$, and ask whether that integer is less than $m$. The probability of success is $p=m/2^n$, and since $m=2^n/3\pm1/3$ we have $p= 1/3 \pm 1/(3\times 2^n)$. So $\epsilon$ can be made exponentially small in $n$.
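Here is a minimal Python sketch of this fixed-length scheme; the function name and the use of `random.randint` as the fair coin are my own choices, not part of the answer:

```python
import random

def approx_bernoulli_third(n):
    """Approximate a Bernoulli(1/3) sample using exactly n fair coin flips.

    The n flips are read as the binary digits of an integer in [0, 2^n),
    and we return 1 iff that integer is below m, the closest integer to
    2^n / 3, so the success probability is m / 2^n = 1/3 +- 1/(3 * 2^n).
    """
    m = (2**n + 1) // 3          # closest integer to 2^n / 3, computed exactly
    value = 0
    for _ in range(n):
        value = 2 * value + random.randint(0, 1)   # append one fair flip as a bit
    return 1 if value < m else 0
```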
Another alternative is to flip coins until you get a head, and then look at whether the total number of flips was odd or even. The probability of it being even is $1/4 + 1/16 + 1/64 + \dots = 1/3$. This procedure could potentially go on forever (if you never get a head), but the expected number of flips is only $1\times 1/2+2\times 1/4+3\times1/8+\dots = 2$.
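A sketch of this second procedure in Python (again with `random.randint` standing in for the fair coin, and the function name my own):

```python
import random

def bernoulli_third():
    """Sample Bernoulli(1/3) exactly: flip a fair coin until the first head
    and return 1 iff the total number of flips was even.

    P(even) = 1/4 + 1/16 + 1/64 + ... = 1/3.  There is no fixed upper bound
    on the number of flips, but the expected number is 2.
    """
    flips = 0
    while True:
        flips += 1
        if random.randint(0, 1) == 1:        # treat 1 as heads
            return 1 if flips % 2 == 0 else 0
```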