We have a source $X$ with an alphabet of size $N$. Its Shannon entropy is defined as $$E(X)=-\sum _{i=1}^{N}p_{i}\cdot\log _{2}p_{i},$$ where $p_i$ is the probability of symbol $i$ appearing in the stream of characters that makes up the message.
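For concreteness, the definition above can be computed directly from a probability distribution; this is a minimal sketch (the function name `shannon_entropy` and the frequency-estimation helper are illustrative, not from the question), using the usual convention that terms with $p_i = 0$ contribute nothing to the sum:

```python
import math
from collections import Counter

def shannon_entropy(probs):
    # E(X) = -sum_i p_i * log2(p_i); zero-probability terms contribute 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

def empirical_entropy(stream):
    # Estimate p_i from observed symbol frequencies in the stream
    counts = Counter(stream)
    total = len(stream)
    return shannon_entropy(c / total for c in counts.values())

# A uniform source over 4 symbols gives log2(4) = 2 bits per symbol
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note that `empirical_entropy` measures only the first-order symbol frequencies; it does not capture any structure (such as periodicity) in the ordering of the stream.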
Suppose the source transmits $M>N$ symbols such that the resulting signal is periodic (i.e., it "completes a pattern within a measurable time frame, called a period and repeats that pattern over identical subsequent periods", per Wikibooks). What is the entropy in this case?