I am trying to prove the following identity for a Binomial $(n,p)$ random variable $X$,
$$ H(X)= n h_2(p) - \mathbb{E}\lg\binom{n}{X}, $$
where $h_2(p)=-p\lg p-(1-p)\lg(1-p)$ is the binary entropy function.
I've started with the definition of entropy but I am unable to juggle the terms appropriately to reach the above identity.
We have $P(X=i)=\binom n i p^i (1-p)^{n-i}$. So $$ H(X) = -\sum_{i=0}^n P(X=i)\lg P(X=i)$$ $$ = -\sum_{i=0}^n \binom n i p^i (1-p)^{n-i}\left(\lg \binom n i+i\lg p+(n-i)\lg(1-p)\right).$$ Now let's look at each term separately: $$-\sum_{i=0}^n \binom n i p^i (1-p)^{n-i}\lg \binom n i=-E\lg\binom nX,$$ $$-\sum_{i=0}^n \binom n i p^i (1-p)^{n-i}\, i\lg p=-\lg p\, EX=-np\lg p,$$ $$-\sum_{i=0}^n \binom n i p^i (1-p)^{n-i} (n-i)\lg (1-p)=-\lg (1-p)\, E(n-X)=-n(1-p)\lg (1-p).$$
Since $h_2(p)=-p\lg p-(1-p)\lg(1-p)$, the last two terms sum to $n h_2(p)$, and we get $$H(X)=n h_2(p)-E\lg\binom nX.$$
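Not part of the proof, but here is a quick numerical sanity check of the identity (a Python sketch; the function names `binomial_entropy` and `rhs` are just illustrative, and all logs are base 2):

```python
import math

def binomial_entropy(n, p):
    # H(X) = -sum_i P(X=i) lg P(X=i) for X ~ Binomial(n, p)
    H = 0.0
    for i in range(n + 1):
        P = math.comb(n, i) * p**i * (1 - p)**(n - i)
        if P > 0:
            H -= P * math.log2(P)
    return H

def rhs(n, p):
    # n*h2(p) - E[lg C(n, X)], with h2 the binary entropy function
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    E_lg_binom = sum(
        math.comb(n, i) * p**i * (1 - p)**(n - i) * math.log2(math.comb(n, i))
        for i in range(n + 1)
    )
    return n * h2 - E_lg_binom

for n, p in [(5, 0.3), (10, 0.5), (20, 0.17)]:
    assert abs(binomial_entropy(n, p) - rhs(n, p)) < 1e-12
```

The loop at the bottom checks the two sides agree to machine precision for a few choices of $(n,p)$.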