$$\min_{\theta_i} \sum_{i=1}^K - \alpha_i \log(\theta_i)$$ s.t. $$1 \geq \theta_i \geq 0$$ $$\sum_{i=1}^K \theta_i = 1$$ where $\alpha_i \geq 0$, but not all $\alpha_i$ are 0.
I know that for $K=2$, $\theta^*_i = \frac{\alpha_i}{\sum_{j=1}^K \alpha_j}$. But I am not sure whether the same expression holds for $K>2$. Even if it does, how can I show it?
Short answer: yes.
Long answer:
Let us set $f(\theta_1,\theta_2, \dots, \theta_K):=\sum_{i=1}^K - \alpha_i \log(\theta_i)$ and, for simplicity, assume $\alpha_i \neq 0$ for all $i$.
Let us also set $g(\theta_1,\theta_2, \dots, \theta_K):=\sum_{i=1}^K \theta_i$ and define the constant $c:=1$. Then the problem has the form that can be solved with the method of Lagrange multipliers: $$\min_{g(x)=c} f(x)$$
According to Lagrange, at a local minimum $x$ of this problem there is a constant $\lambda$ such that
$$ \nabla f(x) = \lambda \nabla g(x) \\ g(x) =c $$
So we get (note that $\lambda$ cannot be 0) $$ - \alpha_i / \theta_i = \lambda \qquad(i=1,\dots,K)\\ \theta_i = - \alpha_i /\lambda \qquad(i=1,\dots,K) $$ Summing the last equations we get $$1=\sum_{i=1}^K \theta_i=-\frac{1}{\lambda}\sum_{i=1}^K \alpha_i$$ so $\lambda=-\sum_{i=1}^K \alpha_i$. If we now substitute this $\lambda$ into $\theta_i = -\alpha_i/\lambda$, we get $$\theta_i =\frac{\alpha_i}{\sum_{j=1}^K \alpha_j}$$
Since $f$ is a sum of convex functions, it is convex itself, and the constraint $g(x)=c$ is affine, so the local minimum is also a global minimum. Note also that the inequality constraints $1 \geq \theta_i \geq 0$ are satisfied automatically, since $\theta^*_i = \alpha_i/\sum_{j=1}^K \alpha_j \in (0,1]$.
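As a sanity check, the closed-form minimizer can be compared numerically against random points on the simplex. This is a minimal sketch; the weights `alpha` below are an arbitrary example with $K=4$:

```python
import math
import random

def f(theta, alpha):
    # Objective: sum of -alpha_i * log(theta_i)
    return -sum(a * math.log(t) for a, t in zip(alpha, theta))

# Example weights (arbitrary choice, all strictly positive)
alpha = [2.0, 1.0, 5.0, 0.5]
total = sum(alpha)
theta_star = [a / total for a in alpha]  # claimed minimizer alpha_i / sum_j alpha_j

# The candidate should beat random interior points of the simplex
random.seed(0)
best = f(theta_star, alpha)
for _ in range(10000):
    x = [random.random() + 1e-9 for _ in alpha]  # positive draws
    s = sum(x)
    theta = [v / s for v in x]                   # normalize onto the simplex
    assert f(theta, alpha) >= best - 1e-12
```

The random sampling is not uniform on the simplex, but that does not matter for the check: no sampled feasible point should achieve a lower objective value than $\theta^*$.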