I saw the following problem in a contest. Having not studied much information theory, I am wondering how one would solve it: In The Hitchhiker’s Guide to the Galaxy, aliens mistook cars for the dominant life form on Earth. If this were true, we assume that cars would communicate with one another by honking.
Telegraph operators had a similar means of communication, Morse Code, which uses short and long presses of a signal. Suppose that a car uses its horn in the same way, with short presses for 1/4 second, long presses for 3/4 second, and pauses for 1/4 second between presses. One of the pioneers of Information Theory, Claude Shannon, studied the entropy of printed English in 1950, and experimentally estimated that written English had an entropy of 0.6 to 1.3 bits per letter, with an average word length of 4.5 letters. If cars were having a conversation with these parameters, but developed an optimal language for converting their horn presses into their own meanings, what would be the average length of time spent honking each word? For simplicity, assume that the average entropy per word is 4 bits. Round your answer to the nearest hundredth.
A short press plus the mandatory pause after it takes $1/4+1/4=1/2$ second, and a long press plus pause takes $3/4+1/4=1$ second, so a honk sequence lasting $n/2$ seconds corresponds to a composition of $n$ into parts of size $1$ and $2$. The number of such sequences is $F_n$, the $n$th Fibonacci number. Since $F_n$ grows like $\left(\frac{1+\sqrt 5}2\right)^n$ up to a constant factor (which vanishes in the limit), we have $$\log_2 F_n\approx n\log_2\frac{1+\sqrt 5}2$$ bits per $n/2$ seconds, so in the limit $$2\log_2\frac{1+\sqrt 5}2\approx 1.39$$ bits per second. With the assumed $4$ bits per word, this combines into $$\frac{4}{2\log_2\frac{1+\sqrt 5}{2}}\approx 2.88$$ seconds per word.
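The argument above can be checked numerically: a short dynamic-programming count (assuming each press is followed by a 1/4 s pause, as in the problem) confirms that the sequence counts are Fibonacci numbers, and the capacity formula then yields the stated answer.

```python
from math import log2, sqrt

def count_sequences(n: int) -> int:
    """Number of honk sequences lasting exactly n half-seconds.

    A short press + pause occupies 1 half-second slot; a long press +
    pause occupies 2 slots. Counting compositions of n into parts 1 and
    2 gives the Fibonacci recurrence.
    """
    a, b = 1, 1  # counts for durations 0 and 1 half-seconds
    for _ in range(n):
        a, b = b, a + b
    return a

# Sanity check: counts follow the Fibonacci sequence 1, 2, 3, 5, 8, ...
print([count_sequences(n) for n in range(1, 9)])  # [1, 2, 3, 5, 8, 13, 21, 34]

# Capacity: log2(F_n) bits per n/2 seconds -> 2*log2(phi) bits/second.
phi = (1 + sqrt(5)) / 2
capacity = 2 * log2(phi)        # ≈ 1.39 bits per second
print(round(4 / capacity, 2))   # 4 bits/word / capacity -> 2.88 s/word
```

The DP makes the composition argument concrete: each sequence of duration $n/2$ seconds ends in either a short press (leaving $n-1$ slots) or a long press (leaving $n-2$ slots), which is exactly the Fibonacci recurrence.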