Suppose I have biased coins that all land heads with probability $p\in(0,1)$. Then I can simulate a fair coin by flipping a pair of biased coins until I get heads-tails or tails-heads, associating heads to the first outcome and tails to the second. Since each of these two outcomes has probability $p(1-p)$, I have extracted a fair coin out of a pair of biased ones.
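This von Neumann trick is easy to sketch in code; here is a minimal Python version (the function names and the use of `random.random` to model the biased coin are my own illustration):

```python
import random

def biased_flip(p):
    """One flip of a coin that shows heads with probability p (True = heads)."""
    return random.random() < p

def fair_flip(p):
    """Von Neumann's trick: flip pairs until they differ; HT -> heads, TH -> tails."""
    while True:
        first, second = biased_flip(p), biased_flip(p)
        if first != second:
            return first
```

The number of pairs needed is geometric with success probability $2p(1-p)$, so the procedure terminates with probability $1$ for any $p\in(0,1)$, though it takes longer the more biased the coin is.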
This is a well-known trick that is often introduced to undergraduates in probability. What I want to know is: Are there other ways to extract "fair" distributions out of "biased" ones?
Two specific examples of what I am asking:
I have a countably infinite sequence of i.i.d. Poisson random variables $X_i$ with mean $\lambda\in\mathbb R_{>0}$. Can I simulate a Poisson random variable $X$ with mean $1$?
I have an uncountable collection of i.i.d. normal random variables $X_t$ with mean $\mu\in\mathbb R$ and variance $\sigma^2\in\mathbb R_{>0}$. Can I simulate a normal random variable $X$ with mean $0$ and variance $1$?
I am not sure if this is important, but I do not mind if the process to generate a "fair" distribution takes infinite time, so long as it converges in some reasonable sense.
I'm not sure about the Poisson example, but when you have a random variable with a continuous distribution, you can simulate any other distribution.
If $X$ has a continuous distribution with CDF $F$, then $F(X)$ is distributed uniformly in $[0, 1]$. From a uniform random variable you can generate any distribution via inverse transform sampling.
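A short Python sketch of both steps (the exponential target and the function names are my own choice for illustration):

```python
import math
import random

def uniform_from(x, cdf):
    """Probability integral transform: if X has continuous CDF F, then F(X) ~ U(0, 1)."""
    return cdf(x)

def sample_via_inverse_transform(inv_cdf):
    """Inverse transform sampling: F^{-1}(U) has CDF F when U ~ U(0, 1)."""
    return inv_cdf(random.random())

# Example target: exponential with rate 1, whose inverse CDF is -ln(1 - u).
x = sample_via_inverse_transform(lambda u: -math.log(1.0 - u))
```

Chaining the two functions turns samples from any continuous distribution with known CDF into samples from any distribution with a computable inverse CDF.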
Specifically for your Gaussian example there's an even simpler method. Just note that for any Gaussian random variable $X \sim \mathcal{N}(\mu, \sigma^2)$ we have $aX + b \sim \mathcal{N}(a\mu + b, a^2 \sigma^2)$.
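With $\mu$ and $\sigma$ known, the affine map $(X - \mu)/\sigma$ already yields a standard normal; a one-line sketch (using `random.gauss` as a stand-in for the given $X_t$):

```python
import random

def standard_normal(mu, sigma):
    """Turn one N(mu, sigma^2) draw into N(0, 1) via the affine map (x - mu) / sigma."""
    x = random.gauss(mu, sigma)
    return (x - mu) / sigma
```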
There is also a neat way to create any unfair coin from multiple throws of a fair coin. Assume you want to "simulate" a coin that shows heads with probability exactly $p$. To do this, flip the fair coin repeatedly until it shows heads for the first time and let $n$ be the number of coin flips you just did. If the $n$-th digit in the binary expansion of $p$ is a 1, your unfair coin shows heads - otherwise tails.
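This works because $P(\text{heads}) = \sum_n 2^{-n}\,d_n(p) = p$, where $d_n(p)$ is the $n$-th binary digit of $p$. A Python sketch (extracting binary digits via floating point, which is fine for illustration but limited to the precision of a `float`):

```python
import random

def unfair_from_fair(p):
    """Simulate a p-coin from fair flips: flip until the first heads (n flips),
    then return heads iff the n-th binary digit of p is 1."""
    n = 1
    while random.random() >= 0.5:  # fair coin shows tails; flip again
        n += 1
    nth_digit = int(p * 2 ** n) % 2  # n-th digit after the binary point
    return nth_digit == 1
```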
Edit: The Gaussian problem is also solvable if $\mu$ and $\sigma^2$ are unknown. First of all note that $X_1 - X_2 \sim \mathcal{N}(0, 2\sigma^2)$, so we can create random variables with mean zero and unknown variance. Now for example we could calculate $\frac{X_1 - X_2}{X_3 - X_4}$, which has the same distribution as the quotient of two independent standard Gaussian variables (note that we have eliminated the need to know the variance by forming a quotient). In theory we could calculate its CDF and then apply the inverse transform sampling again.
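In fact this route is quite explicit: the quotient of two independent standard Gaussians is the standard Cauchy distribution, whose CDF $F(x) = \tfrac12 + \tfrac{1}{\pi}\arctan x$ is easy to evaluate. A sketch (function name my own; `random.gauss` stands in for the given variables):

```python
import math
import random

def uniform_via_cauchy(mu, sigma):
    """(X1 - X2) / (X3 - X4) is standard Cauchy regardless of mu and sigma;
    applying the Cauchy CDF 1/2 + arctan(r)/pi yields U(0, 1)."""
    x = [random.gauss(mu, sigma) for _ in range(4)]
    r = (x[0] - x[1]) / (x[2] - x[3])
    return 0.5 + math.atan(r) / math.pi
```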
More practically, we could use the fact that for independent random variables $X \sim \chi^2(m)$, $Y \sim \chi^2(n)$ we have $\frac{X}{X + Y} \sim \beta(m/2, n/2)$ (see here). Since the $\beta(1, 1)$-distribution is the uniform distribution on $[0, 1]$, we can use this to conclude $$\frac{(X_1 - X_2)^2 + (X_3 - X_4)^2}{(X_1 - X_2)^2 + (X_3 - X_4)^2 + (X_5 - X_6)^2 + (X_7 - X_8)^2} \sim U([0, 1]).$$ From here on, we can again use inverse transform sampling.
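A Python sketch of this eight-sample construction (again with `random.gauss` standing in for the given $X_i$, and a function name of my own):

```python
import random

def uniform_from_unknown_gaussians(mu, sigma):
    """Eight i.i.d. N(mu, sigma^2) draws -> U(0, 1), using no knowledge of mu or sigma.
    Each (X_{2i-1} - X_{2i})^2 is 2*sigma^2 times a chi^2(1) variable, so the ratio
    below is Beta(1, 1), i.e. uniform on [0, 1]; the unknown scale cancels."""
    x = [random.gauss(mu, sigma) for _ in range(8)]
    d = [(x[2 * i] - x[2 * i + 1]) ** 2 for i in range(4)]
    return (d[0] + d[1]) / (d[0] + d[1] + d[2] + d[3])
```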