Von Neumann's method allows us to generate a fair coin flip from any biased coin using two bits (two tosses) of information per attempt (http://en.wikipedia.org/wiki/Fair_coin).
Is the reverse possible: generating an arbitrary Bernoulli random variable from a fair one ($p = \frac {1} {2}$) using two bits of information? I know we can approximate a uniform distribution with infinitely many fair coin tosses, but I'm not sure whether this can be done using only two bits of information.
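For reference, here is a short sketch of the von Neumann trick mentioned above, with an illustrative bias of $0.7$ (that value is my choice, not part of the question): flip the biased coin twice; output the first flip if the two differ, otherwise discard the pair and repeat.

```python
import random

def von_neumann_fair_flip(biased_flip):
    """Turn a biased 0/1 coin into a fair one.

    Flip twice: (0,1) -> 0, (1,0) -> 1; on (0,0) or (1,1) try again.
    The two discordant pairs are equally likely, whatever the bias.
    """
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a

# Illustrative biased coin with P(1) = 0.7.
biased = lambda: 1 if random.random() < 0.7 else 0
flips = [von_neumann_fair_flip(biased) for _ in range(100_000)]
print(sum(flips) / len(flips))  # close to 0.5
```

Note that "two bits" is per attempt: the expected number of tosses is $1/\bigl(p(1-p)\bigr)$, which is unbounded as the bias grows.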
It seems to me that with two flips of a fair coin (two bits of information) you can only simulate the probabilities $\frac14, \frac12, \frac34$ (and, trivially, $0$ and $1$).
This is because you have an equiprobable sample space of $4$ outcomes, so all events have probability $\frac{k}4$ for some $k\in\{0,1,2,3,4\}$.
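With unboundedly many flips, though, any $p$ is reachable, along the lines the question alludes to: use the fair flips to reveal a uniform $U \in (0,1)$ one binary digit at a time and compare it against the binary expansion of $p$, stopping at the first digit where they differ. This is a sketch of that standard construction (the function name and the example value $p = 0.3$ are mine):

```python
import random

def bernoulli_from_fair_flips(p, fair_flip):
    """Draw a Bernoulli(p) sample using only fair coin flips.

    Each fair flip is the next binary digit of a uniform U in (0,1);
    compare it to the next binary digit of p.  At the first digit
    where they differ we know whether U < p.  Terminates with
    probability 1, using 2 flips on average.
    """
    while True:
        u_bit = fair_flip()                 # next binary digit of U
        p *= 2                              # shift next digit of p forward
        p_bit, p = (1, p - 1) if p >= 1 else (0, p)
        if u_bit != p_bit:
            # U's digit is smaller exactly when U < p.
            return 1 if u_bit < p_bit else 0

fair = lambda: random.randint(0, 1)
draws = [bernoulli_from_fair_flips(0.3, fair) for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to 0.3
```

The number of flips is finite almost surely but has no fixed bound, which is consistent with the counting argument above: any procedure that always stops after exactly $n$ flips can only hit probabilities of the form $k/2^n$.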