I have a sequence of binary questions $(U_1,\dots, U_N)$ with some known joint distribution. I know the answers to $n\leq N$ adjacent questions (adjacent cyclically, i.e. with indices taken mod $N$), and want to convey this knowledge with as few bits as possible.
If I do a large number of trials, is there a way to send negligibly more than the amount of information in the answers on average?
If so, what is this method called? I have been thinking about Huffman codes and variable length codes, but am not sure where to turn.
$$\mathbf{Example}$$
Say $N=4, n=2$. We are interested in a vector $(U_1,U_2,U_3,U_4)$ that has some known distribution. With equal probability we know one of the following sets: $$A_1=\{U_1,U_4\}, \ A_2=\{U_1,U_2\},\ A_3=\{U_2,U_3\}, \ \mathrm{or} \ A_4=\{U_3,U_4\}.$$ Say we happen to know $A_1$, in particular that $U_1=1, \ U_4=0$. How do we convey $(1,?,?,0)$ in as few bits as possible?
We can apply a Huffman code over the ternary alphabet $\{0,1,?\}$: a message such as $(1,?,?,0)$ is just a length-$N$ string over this alphabet, where $?$ marks an unknown answer. Since the distribution of the questions and of which set $A_i$ is known are both given, the distribution over these strings is known, and we can build a Huffman code for them directly.
The same idea generalizes to arbitrary $N$, $n$, and any collection of known index sets $A_i$. Huffman coding wastes less than one bit per encoded block, so as the blocks grow (say, by encoding many trials jointly), the per-trial overhead becomes arbitrarily small, and the average length approaches the entropy of the message.
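Here is a minimal sketch of this construction for the $N=4$, $n=2$ example, under the (illustrative, not given in the question) assumption that the answers are i.i.d. fair bits and the four sets $A_1,\dots,A_4$ are equally likely. Each message is one of the 16 equally likely strings over $\{0,1,?\}$, so the entropy is $4$ bits and a binary Huffman code on the messages achieves it exactly:

```python
import heapq
from itertools import count, product
from math import log2

def huffman(probs):
    """Build a binary Huffman code {symbol: bitstring} from {symbol: probability}."""
    tiebreak = count()  # unique counter so ties never compare the dicts
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least-probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c0.items()}
        merged.update({s: "1" + b for s, b in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

# The four cyclically adjacent index pairs A_1..A_4 (0-based indices).
sets = [(0, 3), (0, 1), (1, 2), (2, 3)]

# Assumption: i.i.d. fair bits and equally likely sets -> 16 messages, each 1/16.
messages = {}
for i, j in sets:
    for b0, b1 in product("01", repeat=2):
        msg = ["?"] * 4
        msg[i], msg[j] = b0, b1
        messages["".join(msg)] = 1 / 16

code = huffman(messages)
avg_len = sum(p * len(code[m]) for m, p in messages.items())
entropy = -sum(p * log2(p) for p in messages.values())
print(avg_len, entropy)  # 4.0 4.0 under this uniform assumption
```

For a non-uniform distribution the Huffman code would no longer hit the entropy exactly, but grouping several trials into one block (Huffman-coding tuples of messages) drives the per-trial gap to zero, as described above.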