Given the length of the codeword (i.e. the binary representation of a character: $1$, $1010$, $00$, etc.) for each symbol in an alphabet, how can I calculate the entropy in bits per symbol?
The particular problem I'm solving has the alphabet $A=\{a_1,a_2,a_3,a_4,a_5\}$ with the probabilities:
$$P(a_1)=0.4,\quad P(a_2)=P(a_3)=0.2,\quad P(a_4)=P(a_5)=0.1,$$
which has an entropy of $H(S) \approx 2.122$ bits/symbol and an average length of $L = 2.2$ bits/symbol. Finally, the coding is done with no regard to variance, if that helps.
Consider this Huffman tree:
$$(a_1,((a_2,a_3 ),(a_4,a_5 ) ) ),$$
in which the codes for the $5$ symbols are $a_1=0$, $a_2=100$, $a_3=101$, $a_4=110$, $a_5=111$. The average word length (bits per symbol) is
$$\bar{L}=\sum_{i=1}^5P(a_i)L(a_i)=0.4\times 1+0.6\times 3=2.2$$
as you calculated, and the Shannon entropy (information content) per symbol is
$$S=-\sum_{i=1}^5P(a_i)\log_2P(a_i)=\log_210-1.2=2.1219\mbox{ bits}.$$
The Huffman code thus uses $2.2$ bits on average to encode $2.1219$ bits of information per symbol.
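Both quantities are easy to check numerically. A minimal sketch in Python, using the probabilities from the question and the codeword lengths read off the Huffman tree above (variable names are mine):

```python
import math

# Probabilities P(a_1)..P(a_5) from the question
probs = [0.4, 0.2, 0.2, 0.1, 0.1]
# Codeword lengths from the tree: a_1 -> 0 (length 1), the rest length 3
lengths = [1, 3, 3, 3, 3]

# Average word length: sum of P(a_i) * L(a_i)
avg_len = sum(p * n for p, n in zip(probs, lengths))

# Shannon entropy: -sum of P(a_i) * log2 P(a_i)
entropy = -sum(p * math.log2(p) for p in probs)

print(round(avg_len, 4))   # 2.2
print(round(entropy, 4))   # 2.1219
```

Note that the entropy depends only on the probabilities, not on the codewords; the codeword lengths enter only through the average length $\bar{L}$.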