If $A$ and $B$ are two independent random variables with $n_A$ and $n_B$ possible values respectively, then the joint random variable $AB$ has $n_A \times n_B$ possible values and the joint entropy is simply additive:
$$H(AB) = H(A) + H(B)$$ But what is $H(AB)$ if $A$ and $B$ are not independent, for example, if they are identical?
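The additivity claim can be checked numerically: for independent variables the joint distribution factorizes as $p(a,b) = p(a)\,p(b)$, and the joint entropy equals the sum of the marginal entropies. A minimal sketch (the two distributions `p_a` and `p_b` below are hypothetical examples, not from the text):

```python
import math

def entropy(dist):
    """Shannon entropy H = -sum_x p(x) log2 p(x) of a probability dict."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical marginal distributions for A and B (assumed for illustration).
p_a = {"0": 0.5, "1": 0.5}
p_b = {"x": 0.25, "y": 0.75}

# Independence means the joint distribution factorizes: p(a, b) = p(a) p(b).
joint_indep = {(a, b): p_a[a] * p_b[b] for a in p_a for b in p_b}

h_joint = entropy(joint_indep)
h_sum = entropy(p_a) + entropy(p_b)
print(h_joint, h_sum)  # the two values agree: H(AB) = H(A) + H(B)
```

The joint dictionary has $n_A \times n_B = 4$ entries, matching the count of possible values stated above.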
We can compute the identical case directly:
$$H(A, B) = -\sum_{a, b} p(a, b) \log p(a,b) = -\sum_{a = b} p(a, b) \log p(a,b)$$
where the second equality uses the assumption that $A = B$ with probability $1$, so all the probability mass lies on the diagonal $a = b$. Then, since $P(A=a, B=a) = P(A=a)$ (again because $A=B$), we have:
$$ = -\sum_a p(a)\log p(a) = H(A) = H(B)$$
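The identical case can be verified the same way: when $B = A$ with probability $1$, the joint distribution puts all its mass on the diagonal, and the joint entropy collapses to the entropy of a single copy. A minimal sketch (the distribution `p_a` is a hypothetical example):

```python
import math

def entropy(dist):
    """Shannon entropy H = -sum_x p(x) log2 p(x) of a probability dict."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical distribution for A (assumed for illustration).
p_a = {"0": 0.5, "1": 0.3, "2": 0.2}

# When B = A with probability 1, the joint mass sits on the diagonal:
# p(a, b) = p(a) if a == b, and 0 otherwise.
joint_identical = {(a, a): p_a[a] for a in p_a}

print(entropy(joint_identical), entropy(p_a))  # equal: H(A, B) = H(A)
```

The diagonal joint distribution has the same set of probability values as $p(a)$ itself, which is exactly why the two entropies coincide.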