Logistic regression was taught to me with the intuition that it approximates $\mathbb{P}(Y=y|x;\theta)$. Multiclass versions use one-vs-all classification: for each possible $y$, fit a binary classifier for $Y=y$ vs. $Y\neq y$.
Would $\sum_i\mathbb{P}(Y=y_i|x;\theta)$ obey the second Kolmogorov axiom of probability (i.e. $\mathbb{P}(\Omega)=1$)?
To answer my own question: no. As a check, I ran a multiclass (one-vs-all) logistic regression and summed the per-class values for each sample; some sums were greater than 1 and some were less than 1.
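The check above can be reproduced on synthetic data. The sketch below (a toy setup I made up for illustration, with hand-rolled gradient descent rather than any particular library) fits one independent binary logistic regression per class and shows that the per-class sigmoid outputs do not sum to 1 across classes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data: Gaussian blobs, purely illustrative.
K, n, d = 3, 100, 2
means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = np.vstack([rng.normal(means[k], 1.0, size=(n, d)) for k in range(K)])
y = np.repeat(np.arange(K), n)
Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append intercept column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_binary(Xb, t, lr=0.1, steps=2000):
    """Plain gradient descent on the binary cross-entropy loss."""
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - t) / len(t)
    return w

# One-vs-all: each classifier is fitted independently of the others.
W = np.column_stack([fit_binary(Xb, (y == k).astype(float)) for k in range(K)])
probs = sigmoid(Xb @ W)          # shape (N, K): one sigmoid per class
row_sums = probs.sum(axis=1)
print(row_sums.min(), row_sums.max())  # generally not 1
```

Each column of `probs` is bounded in $(0,1)$, but nothing ties the columns together, so the row sums drift away from 1, especially near decision boundaries.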
I believe the reason is that each one-vs-all sigmoid is a valid probability estimate for its own binary subproblem, but the classifiers are fitted independently, so nothing constrains their outputs to sum to 1 across classes. The values are still bounded between 0 and 1 and still rank the classes sensibly (more likely classes receive higher scores), so taking the argmax still classifies reasonably well. If the sum-to-one property is needed, one can normalize the scores, or fit a multinomial (softmax) regression, which satisfies it by construction.
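The contrast can be seen directly on a made-up score matrix (the numbers below are arbitrary, just for illustration): independent sigmoids give row sums that wander away from 1, while softmax normalizes each row by construction.

```python
import numpy as np

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(Z)
    return e / e.sum(axis=1, keepdims=True)

# Arbitrary per-class scores for two samples, three classes.
scores = np.array([[2.0, -1.0, 0.5],
                   [0.1,  0.1, 0.1]])

sig = 1.0 / (1.0 + np.exp(-scores))  # one independent sigmoid per class
print(sig.sum(axis=1))               # row sums are not 1 in general

p = softmax(scores)
print(p.sum(axis=1))                 # row sums are exactly 1
```

Note that both transformations preserve the ranking of the scores within a row, which is why the one-vs-all argmax decision still works even though the sigmoid outputs fail the $\mathbb{P}(\Omega)=1$ axiom.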