Alice and Bob are playing a game. Alice (A) and Bob (B) agree on a protocol before the game starts.
Alice is given two distinct integers chosen uniformly at random from 1 through $n$; she picks one of the two numbers and communicates it to Bob. Bob doesn't get to see anything other than the number that Alice chose (though he knows the maximum $n$, which is fixed for the game).
How many bits of information $b(n)$ can possibly be transmitted per number on average, if Alice and Bob play this game for a sufficiently long time?
$b(2) = 1$ is quite obvious, but even $b(3)$ is difficult to reason about.
One protocol I came up with is that A always outputs $3$ for inputs $\{1, 3\}$ and $\{2, 3\}$, and outputs $1$ or $2$ to transfer $1$ bit of information when the input is $\{1, 2\}$. This gets an average transmission rate of $\frac{1}{3}$ bits per number. But I have no idea how to generalize this to bigger $n$, or whether it's even optimal.
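A quick Python sketch of this $n = 3$ protocol (the function names are mine); it enumerates all (pair, bit) cases and confirms that the intended bit gets through in exactly $\frac{1}{3}$ of them:

```python
from itertools import combinations

def encode(pair, bit):
    """Alice's protocol for n = 3: {1,2} carries one bit, any pair containing 3 sends 3."""
    if pair == (1, 2):
        return 2 if bit else 1
    return 3

def decode(sent):
    """Bob's side: 1 -> False, 2 -> True, 3 -> no information (None)."""
    return {1: False, 2: True, 3: None}[sent]

# Count the (pair, bit) cases in which the bit actually gets through.
informative = sum(
    decode(encode(p, b)) == b
    for p in combinations(range(1, 4), 2)
    for b in (False, True)
    if decode(encode(p, b)) is not None
)
rate = informative / 6  # 3 pairs x 2 bit values
print(rate)  # 0.333... -> 1/3 of cases transmit the intended bit
```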

Alice divides the integers 1 through n into two disjoint subsets T and F (for example T = numbers <= n/2 and F = numbers > n/2, or T = even numbers and F = odd numbers). If n is even then there are n/2 numbers in each subset; if n is odd then she has two sets with (n-1)/2 numbers each plus one leftover number (which can be considered the sole element of a third set O).
To transmit TRUE she sends a number from set T if possible, otherwise from O if possible, otherwise from F if both her numbers are in that set. To transmit FALSE she does the opposite.
Bob assumes any T number is TRUE, any F number is FALSE, and any O number carries no information.
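The encode/decode rules above can be sketched in Python (the names and the exact fallback ordering are my reading of the protocol, using the lower-half/upper-half split as T and F):

```python
def classify(x, n):
    """Split 1..n: T = lower half, F = upper half, O = the middle leftover when n is odd."""
    half = n // 2
    if x <= half:
        return 'T'
    if x > n - half:
        return 'F'
    return 'O'  # only exists when n is odd

def encode(pair, bit, n):
    """Alice sends a number from the class matching her bit if possible,
    otherwise from O, otherwise whatever she has."""
    want = 'T' if bit else 'F'
    by_class = {classify(x, n): x for x in pair}
    for cls in (want, 'O'):
        if cls in by_class:
            return by_class[cls]
    return next(iter(pair))  # both numbers are in the opposite class

def decode(sent, n):
    """Bob: T means TRUE, F means FALSE, O carries nothing (None)."""
    return {'T': True, 'F': False, 'O': None}[classify(sent, n)]
```

For example with n = 5 (T = {1,2}, O = {3}, F = {4,5}), the pair {1,5} transmits either bit faithfully, while the pair {2,3} can only transmit TRUE (FALSE forces her to send the uninformative 3).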
For n = 2, 3, 4, 5, 6, 7, 8, ... the number of transmitted bits is 1, 2/3, 2/3, 3/5, 3/5, 4/7, 4/7, ... . The pattern: writing m = ceil(n/2), the rate for n >= 3 is m/(2m - 1). For sufficiently large n, the limit is 1/2 bits per transmission.