Suppose Alice and Bob agree that she will send him a sequence of $4$ bits of data which are not all $1$s. This is a simple example of a communication protocol as defined, for example, in Wikipedia. The result is that Alice can make a free choice among $15$ options and send that choice to Bob. So this protocol effectively sends slightly fewer than $4$ bits of information: to be precise, $\log_2 15$ bits. Similarly, for any integer $n \ge 1$, there is an obvious protocol which effectively sends $\log_2 n$ bits of information.
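As a quick sanity check, one can enumerate the allowed sequences and compute the effective information content directly (a minimal sketch; the variable names are my own):

```python
import math
from itertools import product

# Enumerate all 4-bit sequences except the forbidden all-ones sequence.
allowed = [bits for bits in product([0, 1], repeat=4) if bits != (1, 1, 1, 1)]

# Alice has one free choice among these options.
assert len(allowed) == 15

# Effective information content of one use of the protocol, in bits:
# slightly fewer than 4.
effective_bits = math.log2(len(allowed))
print(f"{effective_bits:.4f}")
```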
Are there any other possibilities here? That is, for what real values $r$ is there a protocol which effectively sends $r$ bits?
In particular, is there a protocol which sends more than zero bits but less than one bit? If there were, we would expect that Alice could not use a single run of it to send one full bit, but could send one bit by repeating the protocol several times.
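The repetition idea can be made concrete with the $\log_2 15$ protocol above: $k$ independent uses give $15^k$ distinguishable messages, so whole bits can always be extracted from fractional-bit uses (an illustrative sketch, not a definition of "protocol"):

```python
import math

options_per_use = 15  # one use of the 4-bit, not-all-ones protocol
effective_bits = math.log2(options_per_use)

# k independent uses give 15**k distinguishable messages, i.e.
# k * log2(15) effective bits; floor gives the guaranteed whole bits.
for k in range(1, 5):
    total_options = options_per_use ** k
    whole_bits = math.floor(k * effective_bits)
    print(k, total_options, whole_bits)
```

For example, two uses already give $225 \ge 2^7$ messages, hence $7$ whole bits.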
I suspect the answer is no. I'd appreciate any thoughts on how to prove it.
I don't have a definition of "protocol" or "effectively sends", but the idea seems pretty clear. Here are some more details. (1) We're assuming that the basic communication channel has no noise. (2) If she sends the forbidden $4$-bit sequence, we consider that the protocol was not completed; it's much the same as if she sent $3$ bits and then stopped. (3) We should allow a protocol to specify multiple messages back and forth between Alice and Bob.
Edit: expanded for clarity.
First, you need to distinguish between source entropy and the amount of information per bit.
Note that by "slightly fewer than $4$ bits of information", i.e. $\log_2(15)$, you are referring to the source entropy, with the implicit assumption that all words are equally likely (think of the source as a dictionary of words).
Here you send $4$ actual bits. If your dictionary had two equally likely words and you still used $4$ bits, then $\frac{\log_2(2)}{4}=0.25$ bits of information per actual bit would have been communicated.
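This ratio is easy to compute in general (a small sketch; the function name is mine, and it assumes equally likely words):

```python
import math

def info_per_channel_bit(num_equally_likely_words: int, channel_bits: int) -> float:
    """Source entropy (equally likely words) divided by actual bits sent."""
    return math.log2(num_equally_likely_words) / channel_bits

# Two equally likely words encoded in 4 actual bits: 0.25 bits per actual bit.
print(info_per_channel_bit(2, 4))

# Fifteen equally likely words in 4 actual bits: just under 1 bit per actual bit.
print(info_per_channel_bit(15, 4))
```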
You could also consider a different source whose word probabilities are such that the source entropy (information) is less than one bit. In that case, regardless of what "protocol" you use, and no matter how many actual bits it uses, the amount of information you communicate equals the source entropy, which is less than one bit.
Edit: here is an example for more clarity. The symbol probabilities of two sources $S_1$ and $S_2$, each with $15$ symbols, together with their entropies, are as follows:
where the entropy is $H=-\sum_i p_i\log_2(p_i)$
$S_1$ is the source assumed in the question whose entropy is $3.875$. However, the entropy of source $S_2$ is $0.5429$.
You can see that it is not the "protocol" that governs the amount of information communicated, but the source itself.
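The entropy formula above is straightforward to evaluate. Since the probability table is not reproduced here, the two distributions below are my own assumptions: a uniform $15$-symbol source (entropy $\log_2 15 \approx 3.907$ bits) and a hypothetical heavily skewed $15$-symbol source whose entropy comes out below one bit:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform source over 15 symbols: entropy log2(15), about 3.907 bits.
uniform = [1 / 15] * 15
print(entropy(uniform))

# Hypothetical heavily skewed 15-symbol source: entropy below one bit,
# so no protocol can convey more than that much information per symbol.
skewed = [0.93] + [0.005] * 14
print(entropy(skewed))
```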