Consider a function $f(z)$, where $z=(x,y)$ with $x=(x_1,\dots,x_n)$ and $y=(y_1,\dots,y_m)$. If we want to anti-symmetrize this function so that $F(z_\sigma)= \text{sign}(\sigma)\, F(z)$ for every permutation $\sigma \in S_{n+m}$ (here $z_\sigma$ denotes $z$ with its $n+m$ entries permuted by $\sigma$), the standard practice is to define $$ F(z)= \frac{1}{(n+m)!}\sum_{\sigma \in S_{n+m}} \text{sign}(\sigma)\, f(z_\sigma). $$
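For concreteness, here is a small numerical sanity check of the defining property $F(z_\sigma)=\text{sign}(\sigma)F(z)$ for the standard anti-symmetrization; the toy $f$ and the sizes $n,m$ are my own arbitrary choices:

```python
import itertools
import math

# Toy sizes and a toy f (my own choice); f need not have any symmetry.
n, m = 2, 1

def f(x, y):
    return x[0]**2 * x[1] + 3.0 * x[0] * y[0] - y[0]**2

def sign(perm):
    # Sign of a permutation via its inversion count.
    inversions = sum(
        1
        for i in range(len(perm))
        for j in range(i + 1, len(perm))
        if perm[i] > perm[j]
    )
    return -1 if inversions % 2 else 1

def F(z):
    # F(z) = 1/(n+m)! * sum_{sigma} sign(sigma) f(z_sigma)
    total = 0.0
    for p in itertools.permutations(range(n + m)):
        zp = [z[i] for i in p]
        total += sign(p) * f(zp[:n], zp[n:])
    return total / math.factorial(n + m)

z = [0.7, -1.2, 2.5]
sigma = (1, 0, 2)                       # a transposition, sign -1
z_sigma = [z[i] for i in sigma]
print(abs(F(z_sigma) - sign(sigma) * F(z)) < 1e-12)  # → True
```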
Now, there are certainly special cases where the summation simplifies, e.g., if $f$ is already anti-symmetric, then we can just take $F(z)=f(z)$. Let us consider the special case where $f$ is anti-symmetric in $x$ and in $y$ separately, that is, $f(x_\sigma, y)=\text{sign}(\sigma)\, f(x,y)$ for every $\sigma\in S_n$, and $f(x, y_\tau)=\text{sign}(\tau)\, f(x,y)$ for every $\tau\in S_m$. In this case, the summation can presumably be reduced to permutations with "connected cycles": if you decompose $\sigma \in S_{n+m}$ into its (unique) disjoint cycles, at least one cycle should contain both an $x$-index and a $y$-index. One can probably simplify even further, since many such connected permutations are related to one another by permutations acting within $x$ or within $y$ separately.
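To make the "related by permutations within $x$ or $y$" idea concrete, here is a numerical experiment (toy $f$ and the function names are my own choices) comparing the full sum with a sum over one representative per coset of $S_n\times S_m$, namely the $\binom{n+m}{n}$ permutations that keep each block's entries in increasing order, reweighted by $n!\,m!$:

```python
import itertools
import math

n, m = 2, 2

def f(x, y):
    # Toy f (my own choice), anti-symmetric in x and in y separately:
    # swapping x[0],x[1] or y[0],y[1] flips the sign.
    return (x[0] - x[1]) * (y[0] - y[1]) * (x[0] + x[1] + 2.0 * y[0] * y[1])

def sign(perm):
    # Sign of a permutation via its inversion count.
    inversions = sum(
        1
        for i in range(len(perm))
        for j in range(i + 1, len(perm))
        if perm[i] > perm[j]
    )
    return -1 if inversions % 2 else 1

def F_full(z):
    # Full anti-symmetrization over all (n+m)! permutations.
    total = 0.0
    for p in itertools.permutations(range(n + m)):
        zp = [z[i] for i in p]
        total += sign(p) * f(zp[:n], zp[n:])
    return total / math.factorial(n + m)

def F_reduced(z):
    # One representative per coset of S_n x S_m: permutations that keep
    # each block's indices in increasing order, reweighted by n! * m!.
    total = 0.0
    for xs in itertools.combinations(range(n + m), n):
        ys = tuple(i for i in range(n + m) if i not in xs)
        p = xs + ys
        zp = [z[i] for i in p]
        total += sign(p) * f(zp[:n], zp[n:])
    return total * math.factorial(n) * math.factorial(m) / math.factorial(n + m)

z = [0.3, 1.7, -0.5, 2.2]
print(abs(F_full(z) - F_reduced(z)) < 1e-12)  # → True
```

Since $f$ is anti-symmetric within each block, all $n!\,m!$ permutations in a given coset contribute identical terms, which is why the reduced sum agrees with the full one.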
In this case, what would be a "minimal" summation giving an anti-symmetrization $F$ of $f$? And can such a formula be generalized to $f(z)$ with $z=(z^1,\dots,z^A)$, where $z^a=(z^a_1,\dots,z^a_{n_a})$ and $f$ is anti-symmetric within each block $z^a$ separately?