I have a set of $n$ discrete variables $\mathcal{A} = \{ a_{1}, \dots, a_{n} \}$, where every $a_{i} \in \mathcal{A}$ takes values in $\{ 0, 1, \dots, m \}$ and $n, m \in \mathbb{N}$. I wish to create a joint measure -- an unnormalised probability distribution -- over all joint events described by the variables and then determine a marginal probability distribution over some $a_{i} \in \mathcal{A}$.

Each event must obey the following rule: if $a_{i} = k$ with $k > 0$, then $a_{j} \neq k$ for all $j \neq i$. That is to say, any number of variables may equal zero in an event, but every non-zero value must be unique. Here is an example for $n = 3$ and $m = 2$: \begin{array}{|ccc|c|} \hline a_{1} & a_{2} & a_{3} & \phi(a_{1}, a_{2}, a_{3}) \\ \hline 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 2 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 1 & 2 & 1 \\ 0 & 2 & 0 & 1 \\ 0 & 2 & 1 & 1 \\ 1 & 0 & 0 & 1 \\ 1 & 0 & 2 & 1 \\ 1 & 2 & 0 & 1 \\ 2 & 0 & 0 & 1 \\ 2 & 0 & 1 & 1 \\ 2 & 1 & 0 & 1 \\ \hline & \text{Elsewhere} & & 0\\ \hline \end{array}

I wish to create a similar joint measure for arbitrary $n$ and $m$ and then determine the marginal distribution, $$P(a_{i}) = \frac{1}{K} \sum\limits_{\mathcal{A} - \{a_{i}\} } \phi(\mathcal{A}) \text{,}$$ where $$ K = \sum_{\mathcal{A}} \phi (\mathcal{A}) = \sum_{k=0}^{\min(m, n)} k! \binom{n}{k} \binom{m}{k} $$ is a normalising constant and $\mathcal{A} - \{a_{i}\}$ denotes the set difference. (The expression for $K$ counts events by the number $k$ of non-zero variables: choose the $k$ variables in $\binom{n}{k}$ ways, choose their $k$ distinct values in $\binom{m}{k}$ ways, and assign values to variables in $k!$ ways.) Quite obviously, the marginal distributions over every $a_{j} \in \mathcal{A}$ are all identical, as the joint measure is symmetric. Is it possible to find an expression for this marginal distribution, as I have done for the normalising constant?
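As a sanity check, the counting argument behind $K$ can be verified by brute force in a few lines of Python (the function names here are my own, purely for illustration):

```python
from itertools import product
from math import comb, factorial

def is_valid(event):
    """An event is valid iff its non-zero values are all distinct."""
    nonzero = [v for v in event if v != 0]
    return len(nonzero) == len(set(nonzero))

def K_brute(n, m):
    """Count valid events by exhaustive enumeration of {0,...,m}^n."""
    return sum(is_valid(e) for e in product(range(m + 1), repeat=n))

def K_formula(n, m):
    """Closed form: choose k non-zero variables, choose k values, assign."""
    return sum(factorial(k) * comb(n, k) * comb(m, k)
               for k in range(min(n, m) + 1))

# n = 3, m = 2 reproduces the 13 rows of the table above.
print(K_brute(3, 2), K_formula(3, 2))
```

Both counts agree with the 13 rows of the example table.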
Edit. I believe the marginal distribution is given by $$ P(a_{i} = 0) = \frac{1}{K} \sum\limits_{k=0}^{\min(m, n-1)} k! \binom{n-1}{k} \binom{m}{k} $$ and, for each fixed $j \neq 0$, $$ P(a_{i} = j) = \frac{1}{K} \sum\limits_{k=0}^{\min(m-1, n-1)} k! \binom{n-1}{k} \binom{m-1}{k} \text{,}$$ which is independent of $j$ by symmetry among the non-zero values; however, I would like to show that $$\sum_{j=0}^{m} P(a_{i} = j) = 1 \text{.}$$
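Before attempting a proof, the conjectured closed forms can be checked against a brute-force marginal for small $n$ and $m$ (again, the helper names are mine):

```python
from itertools import product
from math import comb, factorial

def marginal_brute(n, m, j, i=0):
    """Brute-force P(a_i = j): enumerate valid events and count."""
    def is_valid(e):
        nonzero = [v for v in e if v != 0]
        return len(nonzero) == len(set(nonzero))
    events = [e for e in product(range(m + 1), repeat=n) if is_valid(e)]
    return sum(e[i] == j for e in events) / len(events)

def marginal_formula(n, m, j):
    """The conjectured closed forms for P(a_i = j)."""
    K = sum(factorial(k) * comb(n, k) * comb(m, k)
            for k in range(min(n, m) + 1))
    if j == 0:
        return sum(factorial(k) * comb(n - 1, k) * comb(m, k)
                   for k in range(min(n - 1, m) + 1)) / K
    return sum(factorial(k) * comb(n - 1, k) * comb(m - 1, k)
               for k in range(min(n - 1, m - 1) + 1)) / K

n, m = 4, 3
for j in range(m + 1):
    assert abs(marginal_brute(n, m, j) - marginal_formula(n, m, j)) < 1e-12
assert abs(sum(marginal_formula(n, m, j) for j in range(m + 1)) - 1) < 1e-12
print("marginals match and sum to 1")
```

The check passes for every small $(n, m)$ I tried, which supports both the closed forms and the normalisation.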
Here is how you prove this is a probability distribution. I have dropped the upper limits of the summations; this is harmless, since every additional term contains a vanishing binomial coefficient. Writing $P(a_i \neq 0)$ for the common probability of any particular non-zero value, \begin{align} \sum_{j=0}^m P(a_i=j) &=P(a_i=0)+mP(a_i\neq 0) \\&=\frac{1}{K} \sum_{k\ge 0} k! \binom{n-1}{k} \binom{m}{k}+m\cdot\frac1K\sum_{k\ge 0}k!\binom{n-1}k\binom{m-1}k. \end{align} Multiplying by $K$, \begin{align} K\sum_{j=0}^m P(a_i=j) &= \sum_{k\ge 0} k! \binom{n-1}{k} \binom{m}{k}+\sum_{k\ge 0}k!\binom{n-1}k\cdot m\cdot \binom{m-1}k \\ &= \sum_{k\ge 0} k! \binom{n-1}{k} \binom{m}{k}+\sum_{k\ge 0}k!\binom{n-1}k\cdot (k+1)\binom{m}{k+1} \\ &= \sum_{k\ge 0} k! \binom{n-1}{k} \binom{m}{k}+\sum_{k\ge 0}(k+1)!\binom{n-1}k\binom{m}{k+1} \\ &= \sum_{k\ge 0} k! \binom{n-1}{k} \binom{m}{k}+\sum_{k\ge 1}k!\binom{n-1}{k-1}\binom{m}{k} \\ &= 0!\binom{n-1}0\binom{m}0+\sum_{k\ge 1} k! \left[\binom{n-1}{k}+ \binom{n-1}{k-1}\right]\binom{m}{k} \\ &= 0!\binom{n-1}0\binom{m}0+\sum_{k\ge 1} k! \binom{n}{k}\binom{m}{k} \\ &= \sum_{k\ge 0} k! \binom{n}{k}\binom{m}{k} \\&=K. \end{align} The second equality uses the absorption identity $m\binom{m-1}{k}=(k+1)\binom{m}{k+1}$, the fourth reindexes the second sum via $k \mapsto k-1$, and the sixth applies Pascal's rule $\binom{n-1}{k}+\binom{n-1}{k-1}=\binom{n}{k}$. Now divide by $K$.
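The two binomial identities that drive the proof, together with the resulting recurrence $K(n, m) = S(n-1, m) + m\,S(n-1, m-1)$ where $S(a, b) = \sum_{k} k!\binom{a}{k}\binom{b}{k}$, can also be confirmed numerically (the function `S` is my own shorthand):

```python
from math import comb, factorial

def S(a, b):
    """S(a, b) = sum_{k >= 0} k! C(a,k) C(b,k); terms vanish for k > min(a, b)."""
    return sum(factorial(k) * comb(a, k) * comb(b, k)
               for k in range(min(a, b) + 1))

# Absorption identity used in the derivation: m*C(m-1,k) = (k+1)*C(m,k+1).
for m in range(1, 10):
    for k in range(m):
        assert m * comb(m - 1, k) == (k + 1) * comb(m, k + 1)

# The identity proved above: S(n-1, m) + m*S(n-1, m-1) = S(n, m) = K.
for n in range(1, 8):
    for m in range(1, 8):
        assert S(n - 1, m) + m * S(n - 1, m - 1) == S(n, m)

print("identities verified")
```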