I am self-studying Information Theory and came across this problem concerning the entropy rate of a random walk across this graph. For all logarithms, I am working in base 2.
$$\mu=(3/16, 3/16, 3/16, 3/16, 4/16)$$
Here each $\mu_i$ is the total weight of the edges emanating from node $i$ divided by twice the total weight of all edges. Since this graph is undirected and unweighted, this simplifies to $\mu_i = \frac{E_i}{2E}$, where $E$ is the total number of edges and $E_i$ is the number of edges at node $i$.
The entropy rate is thus: $$\log_2(2E)-H\left(\frac{3}{16}, \frac{3}{16}, \frac{3}{16}, \frac{3}{16}, \frac{4}{16}\right)$$
While this expression is fine, I am a bit concerned about calculating the entropy of this distribution, since after a few expansions it looks like it may be unwieldy. I managed to find a claim that the entropy rate is the same as $$4\left(\frac{3}{16}\log_2 3\right)+\frac{4}{16}\log_2 4$$
I can see the basic decomposition of where each value comes from, but I am unsure why this equality holds. The claim seems to be that the entropy rate across this graph is equal to $$\sum_{i=1}^5\mu_i\log\left(\frac{1}{E_i}\right)$$
Is the idea that we are converging to the stationary distribution, so we are essentially just calculating the entropy for some very large $n$? The $\mu_i$ seems like it can be likened to $P(X_n=x)$, but what is $\frac{1}{E_i}$? To me, the term inside the summation should instead be $\mu_i\log_2 E_i$, i.e. the log of the reciprocal of the transition probability $\frac{1}{E_i}$, not of $E_i$ itself. I am wondering if the above claim was just a sign typo?
Additionally, how would I calculate that entropy in the first equation? I'm sure I am overcomplicating it, but it seems like the chain rule is the only method for expanding it (unless stationarity or some other factor plays a role).
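For what it's worth, $H(\mu)$ here is just the entropy of a single five-outcome distribution, so the definition can be applied directly with no chain rule. A quick numerical sanity check (plain Python, base-2 logs) that $\log_2(2E)-H(\mu)$ agrees with $\sum_i \mu_i \log_2 E_i$:

```python
from math import log2

# Degrees of the five nodes; 2E = 16, so E = 8 edges total.
degrees = [3, 3, 3, 3, 4]
two_E = sum(degrees)                       # 16
mu = [d / two_E for d in degrees]          # stationary distribution

# H(mu): entropy of a plain five-outcome distribution -- no chain rule needed.
H_mu = -sum(p * log2(p) for p in mu)

# Entropy rate, two ways:
rate_direct = log2(two_E) - H_mu                              # log2(2E) - H(mu)
rate_closed = sum(p * log2(d) for p, d in zip(mu, degrees))   # sum mu_i log2 E_i

print(rate_direct, rate_closed)  # both ~ 1.6887
```

Both expressions come out to roughly $1.6887$ bits, which supports the claim being a sign typo.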

I was able to come to an answer:
$$4\cdot\frac{3}{16}\log_2 3+\frac{4}{16}\log_2 4=\sum_{i=1}^5\mu_iH(X_2|X_1=i)$$
This holds because, conditioned on being at a given node, the next state is uniformly distributed over the edges emanating from that node. Since a uniform distribution achieves maximal entropy, the conditional entropy equals its upper bound $\log_2|C_i|$, where $|C_i|$ is the number of allowable next states given we are at node $i$ — which is exactly $E_i$, the number of edges emanating from node $i$. Substituting $H(X_2|X_1=i)=\log_2 E_i$ also recovers the first formula, since $$\sum_{i=1}^5\mu_i\log_2 E_i=\sum_{i=1}^5\frac{E_i}{2E}\left(\log_2(2E)-\log_2\frac{2E}{E_i}\right)=\log_2(2E)-H\left(\frac{E_1}{2E},\dots,\frac{E_5}{2E}\right).$$
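As a final sanity check, here is a power iteration on one concrete graph realizing this degree sequence — $K_5$ with edges $(0,1)$ and $(2,3)$ removed (an assumption on my part, since the original graph isn't shown; any graph with degrees $(3,3,3,3,4)$ gives the same stationary distribution and rate):

```python
from math import log2

# Hypothetical graph with degree sequence (3, 3, 3, 3, 4): K5 minus two
# disjoint edges. The original graph isn't shown, but the math only
# depends on the degrees.
nodes = range(5)
removed = {frozenset({0, 1}), frozenset({2, 3})}
adj = {i: [j for j in nodes if j != i and frozenset({i, j}) not in removed]
       for i in nodes}

# Power-iterate the random-walk transition matrix to the stationary distribution.
mu = [1 / 5] * 5
for _ in range(500):
    nxt = [0.0] * 5
    for i in nodes:
        for j in adj[i]:
            nxt[j] += mu[i] / len(adj[i])   # next step uniform over E_i edges
    mu = nxt

print([round(p, 4) for p in mu])   # -> [0.1875, 0.1875, 0.1875, 0.1875, 0.25]

# H(X2 | X1 = i) = log2(E_i), since the next step is uniform over E_i edges.
rate = sum(mu[i] * log2(len(adj[i])) for i in nodes)
print(round(rate, 4))              # -> 1.6887
```

The iterated distribution converges to $(3/16, 3/16, 3/16, 3/16, 4/16)$ and the rate matches the closed form above.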