Assume an alphabet in which all letters have the same probability, and that these letters are coded using a binary Huffman code. Prove that the difference in codeword lengths is less than 2.
It seems intuitively right, but I do not know how to prove it formally.
The exact question is:
A source has an alphabet of $K$ letters, all having the same probability. These are coded using a binary Huffman code. Assume that $K = x \,2^j$, with $j$ integer and $1\le x <2$. Prove that all codewords have lengths $j$ or $j + 1$. Hint: Prove, using the code tree, that the difference in codeword length is less than $2$.
It's apparently a modified version of question 2.14 from "Principles of Digital Communications" (Chapter 2: Coding for Discrete Sources) by Robert Gallager.
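Not a proof, of course, but as a sanity check I verified the claim empirically with a small Huffman construction in Python (written from scratch for this question; `huffman_lengths` is just my helper name). For every $K$ from 2 to 32 the codeword lengths are all $j$ or $j+1$ with $j = \lfloor \log_2 K \rfloor$, and the counts match what the Kraft equality forces: $2^{j+1}-K$ words of length $j$ and $2(K-2^j)$ of length $j+1$.

```python
import heapq
from math import floor, log2

def huffman_lengths(weights):
    """Codeword lengths of a binary Huffman code for the given weights."""
    # Heap entries: (weight, unique tiebreaker, symbols in this subtree).
    heap = [(w, i, [i]) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    lengths = [0] * len(weights)
    counter = len(weights)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every codeword below them.
        for sym in s1 + s2:
            lengths[sym] += 1
        heapq.heappush(heap, (w1 + w2, counter, s1 + s2))
        counter += 1
    return lengths

for K in range(2, 33):
    j = floor(log2(K))  # K = x * 2**j with 1 <= x < 2
    lens = huffman_lengths([1] * K)  # equal probabilities
    assert set(lens) <= {j, j + 1}, (K, lens)
    # Counts forced by Kraft equality, given all lengths are j or j+1:
    assert lens.count(j) == 2**(j + 1) - K
    assert lens.count(j + 1) == 2 * (K - 2**j)
print("verified for K = 2..32")
```

(For example, $K=5$ gives $j=2$ and lengths $\{2,2,2,3,3\}$: three words of length 2 and two of length 3, consistent with $2^3-5=3$ and $2(5-4)=2$.)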