Given a biased coin with $P(X=0)=.75$, I've been unable to find a coding scheme which beats the identity code $0\to0$, $1\to1$, which of course costs 1 bit per transmission.
The entropy of the coin is $H(X)\approx.81$ bits, and I'm aware that any scheme restricted to encoding a single flip at a time can't do better than 1 bit, but by encoding blocks of length $n$ for $n$ sufficiently large, it should be possible to get below that.
I know virtually nothing about algebraic coding theory, so most of the codes I've tried have been basic, mostly mapping long runs of zeros to shorter codewords using Huffman coding, but I haven't been able to get below 1 bit per flip. I'm not even trying to get close to the Shannon limit; I'd just like to see a code which gets below 1 bit for some sufficiently large $n$.
$n=2$ is sufficiently large. We then have $$\begin{align} P(00) &= 9/16 \\ P(01) &= 3/16 \\ P(10) &= 3/16 \\ P(11) &= 1/16 \end{align} $$
And so we can construct a Huffman code, assigning shorter codewords to the more likely pairs: $$\begin{align} 00 &\to 0 \\ 01 &\to 10 \\ 10 &\to 110 \\ 11 &\to 111 \end{align}$$
The expected cost of sending two coin flips is now $$ 1 \cdot \frac9{16} + 2 \cdot \frac 3{16} + 3 \cdot \frac{3+1}{16} = \frac{27}{16} $$ so the expected cost per coin flip is $\frac{27/16}2 = \frac{27}{32} < 1 $.
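If you want to sanity-check this numerically, here's a small Python sketch (the `CODE` dictionary is the prefix-free code with codeword lengths 1, 2, 3, 3 matching the cost computation above; the function and variable names are my own):

```python
import random

# Prefix-free code for pairs of flips of a coin with P(0) = 3/4,
# with codeword lengths 1, 2, 3, 3 as in the cost computation above.
CODE = {"00": "0", "01": "10", "10": "110", "11": "111"}

def expected_bits_per_flip(code, p0=0.75):
    """Exact expected codeword length per coin flip."""
    total = 0.0
    for pair, word in code.items():
        p = 1.0
        for bit in pair:
            p *= p0 if bit == "0" else 1 - p0
        total += p * len(word)
    return total / 2  # two flips per codeword

def simulate(code, n_pairs=100_000, p0=0.75, seed=0):
    """Empirical bits per flip over randomly drawn pairs."""
    rng = random.Random(seed)
    bits = 0
    for _ in range(n_pairs):
        pair = "".join("0" if rng.random() < p0 else "1" for _ in range(2))
        bits += len(code[pair])
    return bits / (2 * n_pairs)

print(expected_bits_per_flip(CODE))  # 27/32 = 0.84375
print(simulate(CODE))                # empirically close to 0.84375
```

The exact expected cost comes out to $27/32 = 0.84375$ bits per flip, and the simulation agrees. Coding longer blocks ($n = 3, 4, \dots$) pushes this further toward $H(X) \approx 0.81$.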