Let $A, B : V \rightarrow V$ be Hermitian operators on a $K$-dimensional vector space $V$ such that $[A, B] = 0$. Let $\lambda_1,\ldots,\lambda_N$ denote the eigenvalues of $A$, and let $E_A^{\lambda_i}$ denote the corresponding eigenspaces. Similarly, let $\gamma_1,\ldots, \gamma_M$ denote the eigenvalues of $B$, and let $E_B^{\gamma_j}$ denote the corresponding eigenspaces.
Since $A$ and $B$ commute, there exists an orthonormal basis of $V$ consisting of simultaneous eigenvectors, which diagonalizes both $A$ and $B$. We can therefore define the simultaneous eigenspaces $E_{A,B}^{\lambda_i,\gamma_j}$, which have the property
$$v\in E_{A,B}^{\lambda_i,\gamma_j} \implies Av = \lambda_i v \text{ and } Bv = \gamma_jv.$$
It is clear that the direct sum of all of the simultaneous eigenspaces is $V$.
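This decomposition can be checked numerically. The following sketch (with made-up eigenvalues and a random shared eigenbasis) builds a commuting pair of real symmetric, hence Hermitian, operators and verifies that the simultaneous eigenspace dimensions sum to $\dim V$:

```python
import numpy as np

# Hypothetical example: build two commuting Hermitian operators by choosing a
# shared orthonormal eigenbasis Q and real eigenvalues (made-up values).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))  # columns = shared eigenbasis
a = np.array([1, 1, 1, 2, 2, 2], dtype=float)     # eigenvalue of A on each basis vector
b = np.array([5, 5, 7, 5, 7, 7], dtype=float)     # eigenvalue of B on each basis vector

A = Q @ np.diag(a) @ Q.T
B = Q @ np.diag(b) @ Q.T

assert np.allclose(A @ B, B @ A)                  # [A, B] = 0

# dim E_{A,B}^{lambda,gamma} = number of shared basis vectors carrying the
# eigenvalue pair (lambda, gamma); these dimensions sum to dim V, reflecting
# V = direct sum of the simultaneous eigenspaces.
dims = {(lam, gam): int(np.sum((a == lam) & (b == gam)))
        for lam in set(a) for gam in set(b)}
assert sum(dims.values()) == 6
```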
My question is: is there any nice relationship between the dimension of $E_{A,B}^{\lambda_i,\gamma_j}$ and the dimensions of the eigenspaces $\{E_A^{\lambda_i}\}, \{E_B^{\gamma_j}\}$? Ideally, I would like to write $\dim\left(E_{A,B}^{\lambda_i,\gamma_j}\right)$ in terms of $\left\{\dim\left(E_A^{\lambda_i}\right)\right\}$ and $\left\{\dim\left(E_B^{\gamma_j}\right)\right\}$. After exploring this question for a while, the best I can come up with is:
$$ \dim\left(E_A^{\lambda_i}\right) = \sum_{j=1}^{M} \dim\left(E_{A,B}^{\lambda_i,\gamma_j}\right),$$
but this goes in the opposite direction of what I would like.
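This identity is easy to check numerically. In the sketch below (with made-up eigenvalues), $A$ and $B$ are taken diagonal in a shared basis, so each basis vector carries a pair of eigenvalues and every eigenspace dimension is just a count:

```python
from collections import Counter

# Hypothetical setup: A and B diagonal in a shared orthonormal basis, so each
# basis vector e_k carries an eigenvalue pair (a_k, b_k).  (Made-up values.)
pairs = [(1, 5), (1, 5), (1, 7), (2, 5), (2, 7), (2, 7)]   # dim V = 6

dim_A  = Counter(a for a, _ in pairs)   # dim E_A^{lambda_i}
dim_B  = Counter(b for _, b in pairs)   # dim E_B^{gamma_j}
dim_AB = Counter(pairs)                 # dim E_{A,B}^{lambda_i, gamma_j}

# dim E_A^{lambda_i} = sum over j of dim E_{A,B}^{lambda_i, gamma_j}
for lam in dim_A:
    assert dim_A[lam] == sum(dim_AB[(lam, gam)] for gam in dim_B)
```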
Also, it is interesting to me that the above equation resembles the Law of Total Probability
$$P(X_i) = \sum_{j} P(X_i \cap Y_j),$$
where probabilities $P$ are equated with dimensions and events $\{X_i\}$, $\{Y_j\}$ are equated with eigenspaces; this almost resembles the use of probability theory in quantum mechanics. So perhaps this question is equivalent to inverting the Law of Total Probability to write $P(X_i \cap Y_j)$ in terms of $\{P(X_i)\}$ and $\{P(Y_j)\}$. I'm not familiar with probability theory, so I am not sure whether such an inversion exists.
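On the probability side, no such inversion can exist in general, because different joint distributions can share the same marginals. A minimal sketch with made-up numbers:

```python
# Two joint distributions over events {X1, X2} x {Y1, Y2} with identical
# marginals P(X_i), P(Y_j) but different joints P(X_i ∩ Y_j), so the
# marginals alone cannot determine the joint.  (Made-up numbers.)
independent = {("X1", "Y1"): 0.25, ("X1", "Y2"): 0.25,
               ("X2", "Y1"): 0.25, ("X2", "Y2"): 0.25}
correlated  = {("X1", "Y1"): 0.50, ("X1", "Y2"): 0.00,
               ("X2", "Y1"): 0.00, ("X2", "Y2"): 0.50}

def marginals(joint):
    """Sum the joint over each coordinate to recover P(X_i) and P(Y_j)."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

assert marginals(independent) == marginals(correlated)  # same marginals...
assert independent != correlated                        # ...different joints
```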
After thinking about this some more, I believe the best one can come up with is: $$ \max\left\{0, \dim\left(E_A^{\lambda_i}\right)+\dim\left(E_B^{\gamma_j}\right) - \dim(V)\right\} \leq \dim\left(E_{A,B}^{\lambda_i,\gamma_j}\right)\leq \min\left\{\dim\left(E_A^{\lambda_i}\right), \dim\left(E_B^{\gamma_j}\right)\right\}.$$ The upper bound is obvious: in general, for subspaces $U,W$, the dimension of $U\cap W$ is at most the dimension of each of $U$ and $W$, and $E_{A,B}^{\lambda_i,\gamma_j} = E_{A}^{\lambda_i} \cap E_B^{\gamma_j}$.
The lower bound follows from a pigeonhole-style argument on the simultaneous eigenvectors of $A$ and $B$. Fix an orthonormal basis of $V$ that diagonalizes both $A$ and $B$. Let $a_1,\ldots,a_n$ be the basis vectors that span $E_A^{\lambda_i}$, and let $b_1,\ldots,b_m$ be the basis vectors that span $E_B^{\gamma_j}$. If $n + m > \dim(V)$, then at least $n + m - \dim(V)$ of these basis vectors must coincide, since otherwise $\dim\left(E_A^{\lambda_i}+E_B^{\gamma_j}\right) > \dim(V)$, a contradiction; and each shared basis vector lies in $E_{A,B}^{\lambda_i,\gamma_j}$. In the case that $n + m \leq \dim(V)$, it is possible for none of the basis vectors to coincide, in which case the simultaneous eigenspace is $\{0\}$.
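These bounds also seem to be the end of the story: the marginal dimensions alone cannot determine the simultaneous ones, since two eigenvalue assignments can have identical marginal dimensions but different joint dimensions. A small sketch with hypothetical eigenvalues, where both extremes of the bounds are attained:

```python
from collections import Counter

# Two hypothetical eigenvalue assignments on a 4-dimensional space.  Both give
# dim E_A^{1} = dim E_A^{2} = dim E_B^{5} = dim E_B^{7} = 2, yet the
# simultaneous eigenspace dimensions differ: case_1 hits the lower bound
# max{0, 2 + 2 - 4} = 0, case_2 hits the upper bound min{2, 2} = 2.
case_1 = [(1, 5), (1, 7), (2, 5), (2, 7)]
case_2 = [(1, 5), (1, 5), (2, 7), (2, 7)]

# Same marginal dimensions, different joint dimensions: no formula for
# dim E_{A,B} purely in terms of the marginal dimensions can exist.
assert Counter(a for a, _ in case_1) == Counter(a for a, _ in case_2)
assert Counter(b for _, b in case_1) == Counter(b for _, b in case_2)
assert Counter(case_1) != Counter(case_2)
```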