The Dirichlet distribution gives the probability that a sample came from a particular multinomial distribution, assuming that all multinomial distributions were a priori equally likely to have generated it.
Each multinomial distribution has a corresponding categorical distribution, and the entropy of that categorical distribution is given by
$$-\sum_{x\,\in\,\text{states}}\Pr(x)\ln(\Pr(x))$$
Given a point $p=(p_1,p_2,\dots,p_n)$ with $\sum_i p_i=1$, chosen at random according to a Dirichlet distribution with parameters $k_1,\dots,k_n$, the entropy of the corresponding categorical distribution is:
$$H(p)=-\sum_{i=1}^n p_i \ln(p_i)$$
What is the expected value of $H(p)$?
In the special case where the Dirichlet distribution is defined by just two parameters $k_1$ and $k_2$, so that $p$ is 2-dimensional, the expected entropy $H(p)$ is given by the formula
$$\frac{(k_1+k_2) H_{k_1+k_2-1}-k_1 H_{k_1}-k_2 H_{k_2-1}}{k_1+k_2}$$
where $H_n$ is the $n$th harmonic number. However, I haven't been able to calculate the answer for greater numbers of dimensions.
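As a sanity check, the two-parameter formula can be compared against a Monte Carlo estimate. This is my own sketch (function names are mine, not from any reference), assuming $k_1$ and $k_2$ are positive integers and are the Dirichlet parameters themselves, so that $p_1 \sim \text{Beta}(k_1, k_2)$:

```python
# Numerical sanity check of the two-parameter expected-entropy formula.
import numpy as np

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n, with H_0 = 0
    return sum(1.0 / j for j in range(1, n + 1))

def expected_entropy_2d(k1, k2):
    # ((k1+k2) H_{k1+k2-1} - k1 H_{k1} - k2 H_{k2-1}) / (k1+k2)
    return ((k1 + k2) * harmonic(k1 + k2 - 1)
            - k1 * harmonic(k1)
            - k2 * harmonic(k2 - 1)) / (k1 + k2)

# Monte Carlo: draw p1 ~ Beta(k1, k2), average the entropy of (p1, 1 - p1)
rng = np.random.default_rng(0)
p1 = rng.beta(3, 5, size=200_000)
mc = -(p1 * np.log(p1) + (1 - p1) * np.log(1 - p1)).mean()

print(expected_entropy_2d(3, 5), mc)  # the two estimates should agree closely
```

With 200,000 samples the Monte Carlo standard error is small enough that the two values typically match to about three decimal places.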
So as it turns out, the general closed-form solution is
$$\mathbb{E}[H(P)]=H_A-\frac{1}{A}\sum _{i=1}^m \alpha _i H_{\alpha _i}$$
where $m$ is the number of states, $H(P)$ is the entropy of the probability distribution $P$ in which each state $s_i$ occurs with probability $p_i$, the $\alpha _i$ are the parameters of the Dirichlet distribution that $P$ is drawn from, and $A=\sum_{i=1}^m \alpha _i$. In the notation of the question, $\alpha_i = k_i$; with $m=2$ this reduces to the two-parameter formula above.
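A minimal sketch of the general formula in code (my own, assuming all the $\alpha_i$ are positive integers), cross-checked against sampling in four dimensions:

```python
# Implementation of the general expected-entropy formula for a Dirichlet
# distribution with integer parameters, plus a Monte Carlo cross-check.
import numpy as np

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n, with H_0 = 0
    return sum(1.0 / j for j in range(1, n + 1))

def expected_entropy(alphas):
    # E[H(P)] = H_A - (1/A) * sum_i alpha_i * H_{alpha_i}
    A = sum(alphas)
    return harmonic(A) - sum(a * harmonic(a) for a in alphas) / A

# Monte Carlo estimate of E[H(p)] over Dirichlet samples
rng = np.random.default_rng(0)
alphas = [2, 3, 4, 6]
samples = rng.dirichlet(alphas, size=200_000)
mc = -(samples * np.log(samples)).sum(axis=1).mean()

print(expected_entropy(alphas), mc)
```

The closed form and the sampled average agree to within Monte Carlo error.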
I believe the derivation is obvious.

Just kidding, the derivation isn't obvious. I found a closed-form solution for $n=2$, for $n=3$, and for $n=4$, fiddled with those until I arrived at a general formula that looks elegant (see above), and then tested that formula against a Monte Carlo estimate of the expected value. It works, but I can't prove it.
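For what it's worth, the formula does follow from a standard moment identity for the Dirichlet distribution (here $\psi$ is the digamma function). Each marginal $p_i$ of a $\text{Dir}(\alpha_1,\dots,\alpha_m)$ is $\text{Beta}(\alpha_i, A-\alpha_i)$, and differentiating the Beta normalizing constant with respect to $\alpha_i$ gives
$$\mathbb{E}[p_i\ln p_i]=\frac{\alpha_i}{A}\left(\psi(\alpha_i+1)-\psi(A+1)\right)$$
Summing over $i$ and negating (using $\sum_i\alpha_i/A=1$) yields
$$\mathbb{E}[H(P)]=\psi(A+1)-\frac{1}{A}\sum_{i=1}^m\alpha_i\,\psi(\alpha_i+1)$$
and since $\psi(n+1)=H_n-\gamma$ for positive integers $n$, the $\gamma$ terms cancel, leaving the harmonic-number formula. The digamma form also extends the result to non-integer parameters.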