Let $\mu_1, \mu_2$ be two probability distributions on a sample space $X$ and let $0 < \alpha < 1$. Define the entropy of a probability distribution $\mu$ to be $$H(\mu) = - \sum_{t \in X} \mu(t) \log(\mu(t))$$ Define a new probability distribution $\alpha \mu_1 + (1 - \alpha)\mu_2$. I am trying to show that $$ H(\alpha \mu_1 + (1- \alpha)\mu_2) \geq \alpha H(\mu_1) + (1 - \alpha) H(\mu_2)$$ i.e. that the entropy is concave.
I've tried $$ H(\alpha \mu_1 + (1- \alpha)\mu_2) = -\sum_{t \in X} (\alpha \mu_1(t) + (1 - \alpha)\mu_2(t)) \log (\alpha \mu_1(t) + (1 - \alpha)\mu_2(t))$$ and tried to use the fact that $\log$ is concave, so $$\log (\alpha \mu_1(t) + (1 - \alpha)\mu_2(t)) \geq \alpha \log(\mu_1(t)) + (1-\alpha)\log(\mu_2(t)),$$ but I don't seem to be getting anywhere. Any help would be appreciated!
Let $f(x)=-x\log x$ for $x>0$. Then $$f^{\prime\prime}(x)=-\frac 1 x<0,$$ so $f$ is concave. Hence for each $t \in X$, $$f\bigl(\alpha \mu_1(t) + (1-\alpha)\mu_2(t)\bigr) \geq \alpha f(\mu_1(t)) + (1-\alpha) f(\mu_2(t)).$$ Summing over $t \in X$ gives $$H(\alpha \mu_1 + (1-\alpha)\mu_2) \geq \alpha H(\mu_1) + (1-\alpha) H(\mu_2),$$ which is the result.
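Not a substitute for the proof, but here is a quick numerical sanity check of the inequality in Python; the two distributions `mu1` and `mu2` below are arbitrary example choices on a 3-point sample space.

```python
import math

def entropy(mu):
    # H(mu) = -sum_t mu(t) log mu(t), with the convention 0 log 0 = 0
    return -sum(p * math.log(p) for p in mu if p > 0)

def mix(mu1, mu2, alpha):
    # pointwise mixture alpha*mu1 + (1 - alpha)*mu2
    return [alpha * p + (1 - alpha) * q for p, q in zip(mu1, mu2)]

# arbitrary example distributions on a 3-point sample space
mu1 = [0.5, 0.3, 0.2]
mu2 = [0.1, 0.1, 0.8]

for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    lhs = entropy(mix(mu1, mu2, alpha))
    rhs = alpha * entropy(mu1) + (1 - alpha) * entropy(mu2)
    # concavity: entropy of the mixture dominates the mixture of entropies
    assert lhs >= rhs - 1e-12
```

The small tolerance `1e-12` only guards against floating-point rounding; the inequality itself is exact, with equality at $\alpha \in \{0, 1\}$ or when $\mu_1 = \mu_2$.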