This question is probably not so hard for you.
Why is the entropy equal to:
$$ H(x,y)=2\log_2(5)-\frac{8}{25}\log_2(2)-\frac{6}{25}\log_2(3), $$
for the following joint distribution?
$$ p(x,y) = \frac{1}{25} \begin{bmatrix} 1&1&1&1&1\\ 2&1&2&0&0\\ 2&0&1&1&1\\ 0&3&0&2&0\\ 0&0&1&1&3 \end{bmatrix} $$
My answer is: $$ H(x,y)=\frac{11}{25}\log_2(\frac{1}{25})-\frac{8}{25}\log_2(\frac{2}{25})-\frac{6}{25}\log_2(\frac{3}{25}), $$
and the only thing I could think of applying here is $\log(\frac{1}{x})=-\log(x)$ and $\log_2(2)=1$, which is useful for $-\log_2(\frac{1}{25})=\log_2(25)=\log_2(5^2)=2\log_2(5)$.
Hope someone can help. It's from this book: https://link.springer.com/book/10.1007/978-0-387-79234-7.
As far as I understand, you need help simplifying the logarithms? Your calculation is almost right (apart from a possible typo):
I think you missed a minus sign: $H(x,y)=-\frac{11}{25}\log_2\frac{1}{25}-\frac{8}{25}\log_2\frac{2}{25}-\frac{6}{25}\log_2\frac{3}{25},$ since the joint Shannon entropy is given by $$H(X,Y)=-\sum_{x\in X}\sum_{y\in Y}p(x,y)\log p(x,y),$$ where I'm using the notation from your book, *Information Theory and Network Coding*, Raymond W. Yeung, p. 43.

Now let's simplify the logarithms. First, note that there are many equivalent ways to do this, so you (and a potential reader) do not have to stick with mine. We have $$-\frac{11}{25}\log_2\frac{1}{25} = \frac{11}{25}\log_2 25 = \frac{22}{25}\log_2 5, \\ -\frac{8}{25}\log_2\frac{2}{25}= -\frac{8}{25}\log_2 2 + \frac{8}{25}\log_2 25 = -\frac{8}{25} + \frac{16}{25}\log_2 5, \\ -\frac{6}{25}\log_2\frac{3}{25} = -\frac{6}{25}\log_2 3 +\frac{6}{25}\log_2 25= -\frac{6}{25}\log_2 3 +\frac{12}{25}\log_2 5. $$ Here I'm using two facts: $$\log \frac{\xi}{\eta} =\log \xi -\log \eta \qquad \& \qquad \log \xi^{\eta}=\eta \log \xi. $$ At the end, sum everything up: $$\frac{22}{25}\log_2 5 -\frac{8}{25} + \frac{16}{25}\log_2 5 -\frac{6}{25}\log_2 3 +\frac{12}{25}\log_2 5 = 2 \log_2 5 -\frac{8}{25} -\frac{6}{25}\log_2 3, $$ which is the expected answer.
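If you want a sanity check, here is a short Python sketch (the variable names `counts`, `H`, and `H_closed` are my own) that computes the entropy directly from the joint distribution in the question and compares it with the simplified closed form:

```python
import math

# Joint distribution p(x, y) from the question, scaled by 25
# (each entry is 25 * p(x, y)).
counts = [
    [1, 1, 1, 1, 1],
    [2, 1, 2, 0, 0],
    [2, 0, 1, 1, 1],
    [0, 3, 0, 2, 0],
    [0, 0, 1, 1, 3],
]

# H(X, Y) = -sum_{x,y} p(x, y) log2 p(x, y), with the convention
# 0 * log 0 = 0 (zero entries are simply skipped).
H = -sum(c / 25 * math.log2(c / 25) for row in counts for c in row if c > 0)

# The simplified closed form from the answer above.
H_closed = 2 * math.log2(5) - 8 / 25 - (6 / 25) * math.log2(3)

print(H, H_closed)
```

Both values agree (roughly 3.94 bits), which confirms the simplification.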