From the Wikipedia article http://en.wikipedia.org/wiki/Multinomial_logistic_regression:
$\ln \frac{\Pr(Y_i=1)}{\Pr(Y_i=K)} = \boldsymbol\beta_1 \cdot \mathbf{X}_i$
$\ln \frac{\Pr(Y_i=2)}{\Pr(Y_i=K)} = \boldsymbol\beta_2 \cdot \mathbf{X}_i$
$\vdots$
$\ln \frac{\Pr(Y_i=K-1)}{\Pr(Y_i=K)} = \boldsymbol\beta_{K-1} \cdot \mathbf{X}_i$
Exponentiating both sides and solving for the probabilities gives:
$\Pr(Y_i = j) = \Pr(Y_i = K)\, e^{\boldsymbol\beta_j \cdot \mathbf{X}_i}$, for $j = 1, \dots, K-1$.
Using the fact that all $K$ probabilities sum to 1, you get:
(*) $\Pr(Y_i=K) = \frac{1}{1 + \sum_{k=1}^{K-1} e^{\boldsymbol\beta_k \cdot \mathbf{X}_i}}$
I don't understand the steps taken to arrive at (*).
Add up $\Pr(Y_i = j) = \Pr(Y_i = K)\, e^{\boldsymbol\beta_j \cdot \mathbf{X}_i}$ for $j = 1, \dots, K-1$, together with $\Pr(Y_i = K) = \Pr(Y_i = K)\, e^{\boldsymbol\beta_K \cdot \mathbf{X}_i}$, where $\boldsymbol\beta_K = 0$. Since the left-hand sides sum to 1, you get
$$ 1 = \Pr(Y_i = K) \sum_{j=1}^{K} e^{\boldsymbol\beta_j \cdot \mathbf{X}_i}. $$
Because $\boldsymbol\beta_K = 0$, the $j = K$ term in the sum equals 1, so dividing through gives exactly (*).
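The identity is easy to check numerically. Here is a minimal sketch (the array names, the choice of $K = 4$, and the random coefficients are all hypothetical): it builds softmax probabilities with the last class as the reference ($\boldsymbol\beta_K = 0$) and verifies that (*) reproduces $\Pr(Y_i = K)$.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4                        # number of classes (hypothetical choice)
x = rng.normal(size=3)       # one feature vector X_i
B = rng.normal(size=(K, 3))  # coefficient vectors beta_1, ..., beta_K
B[K - 1] = 0.0               # reference class: beta_K = 0

logits = B @ x                               # beta_j . X_i for each j
p = np.exp(logits) / np.exp(logits).sum()    # softmax probabilities Pr(Y_i = j)

# (*): Pr(Y_i = K) = 1 / (1 + sum_{k=1}^{K-1} exp(beta_k . X_i))
pK_star = 1.0 / (1.0 + np.exp(logits[:-1]).sum())

print(np.isclose(p[-1], pK_star))  # True: (*) matches the softmax value
print(np.isclose(p.sum(), 1.0))    # True: the K probabilities sum to 1
```

Setting $\boldsymbol\beta_K = 0$ is what makes the $j = K$ term in the denominator equal 1, which is where the "1 +" in (*) comes from.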