Let us say you have a random variable $R$. How would one generate a uniform random variable $U$, with the maximum possible entropy (or infinite entropy, if $R$ has such)? (For simplicity, you may assume that $R$'s sample space is a subset of the real numbers.)
Note: $U$ is essentially obtained by applying a function to $R$. I believe what I am seeking is called a "randomness extractor", specifically one that preserves entropy and works based on a probability distribution.
Here's an idea, assuming that you don't know anything about the distribution of $R$.
Let $(A_i)_{i\in\mathbb{N}}$, $(B_i)_{i\in\mathbb{N}}$ be samples of $R$, i.e. i.i.d. random variables with the distribution of $R$, and set $$ C_i = \begin{cases} -1 & \text{if $A_i < B_i$,} \\ 1 & \text{if $A_i > B_i$,} \\ 0 & \text{if $A_i = B_i$.} \end{cases} $$ Then let $D_i$ be the $i$-th nonzero term of the sequence $(C_j)_{j\in\mathbb{N}}$.
By construction, the $C_i$ are mutually independent, and thus so are the $D_i$. By symmetry (swapping $A_i$ and $B_i$ leaves the joint distribution unchanged), $P(C_i = -1) = P(C_i = 1)$, so the $D_i$ are uniformly distributed on $\{-1,1\}$. You can then use the $D_i$ to produce uniformly distributed values on larger probability spaces.
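A minimal Python sketch of this construction, assuming only a callable `sample` that draws from $R$ (the `sample` interface and the biased example distribution below are illustrative assumptions, not part of the original answer). Fair bits are extracted by comparing pairs of draws and discarding ties, then packed into a uniform integer:

```python
import random

def extract_bits(sample, n_bits):
    """Draw i.i.d. pairs (A_i, B_i) via `sample` and keep the sign of
    their comparison, discarding ties. By symmetry each kept bit is
    0 or 1 with probability 1/2, and the bits are independent."""
    bits = []
    while len(bits) < n_bits:
        a, b = sample(), sample()
        if a < b:
            bits.append(0)   # C_i = -1
        elif a > b:
            bits.append(1)   # C_i = +1
        # a == b gives C_i = 0: discarded
    return bits

def uniform_int(sample, k):
    """Combine k extracted bits into an integer uniform on [0, 2^k)."""
    n = 0
    for bit in extract_bits(sample, k):
        n = (n << 1) | bit
    return n

# Hypothetical heavily biased source standing in for R.
biased = lambda: random.choices([0.0, 1.0, 2.0], weights=[80, 15, 5])[0]
print(uniform_int(biased, 8))  # uniform on {0, ..., 255}
```

Note that if $R$ is degenerate (ties occur with probability 1), no bits are ever produced and the loop never terminates, matching the answer's implicit assumption that $P(A_i \ne B_i) > 0$.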