Suppose I have a multinomial random variable $V$ with $n$ states and a set of $m$ Boolean variables $B_i$.
The graphical representation of the system is a naive Bayes network with $V$ at the center and the variables $B_i$ each having $V$ as their unique parent.
This graphical representation implies the conditional independences: $\forall i \lt j , I(B_i,B_j | V)$, i.e. knowing $V$, $B_i$ and $B_j$ are independent.
The probability distribution is fully described by the marginal probability of $V$ and by the conditional probabilities $P(B_i \mid V)$.
I also want my system to satisfy a marginal independence property between every pair of $B_i$ (these independences are not implied by the graphical representation of my Bayesian network).
So, to sum up, I have:
$\forall i \lt j , I(B_i,B_j)$
$\forall i \lt j , I(B_i,B_j | V)$
$\forall i , \lnot I(B_i,V)$
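To make the three requirements concrete, here is a small numeric sanity check. The specific numbers ($n=3$, $m=2$, a uniform $P(V)$, and the two conditional-probability rows) are values I picked by hand purely for illustration; they are not part of the question:

```python
import numpy as np

# Hand-picked toy configuration: n = 3 states for V, m = 2 Booleans.
pV = np.array([1/3, 1/3, 1/3])          # marginal P(V)
p = np.array([[0.2, 0.5, 0.8],          # P(B_1 = 1 | V = v)
              [0.2, 0.8, 0.2]])         # P(B_2 = 1 | V = v)

# Joint P(V, B_1, B_2) from the naive-Bayes factorization
# P(v, b1, b2) = P(v) P(b1 | v) P(b2 | v); conditional independence
# given V holds by construction.
joint = np.zeros((3, 2, 2))
for v in range(3):
    for b1 in (0, 1):
        for b2 in (0, 1):
            q1 = p[0, v] if b1 else 1 - p[0, v]
            q2 = p[1, v] if b2 else 1 - p[1, v]
            joint[v, b1, b2] = pV[v] * q1 * q2

pB1 = joint.sum(axis=(0, 2))            # marginal of B_1
pB2 = joint.sum(axis=(0, 1))            # marginal of B_2
pB1B2 = joint.sum(axis=0)               # joint of (B_1, B_2)

# Marginal independence I(B_1, B_2): P(b1, b2) = P(b1) P(b2).
assert np.allclose(pB1B2, np.outer(pB1, pB2))
# Dependence on V: the rows P(B_i = 1 | V) are not constant.
assert not np.allclose(p[0], p[0, 0])
assert not np.allclose(p[1], p[1, 0])
```

So at least for $n=3$ a configuration with two such $B_i$ exists; the question is how far this can be pushed.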
My question is: how many $B_i$ can we have, depending on $n$?
My idea is that each pairwise marginal independence provides one constraint per pair $(i,j)$. There are $\frac{m(m-1)}{2}$ such pairs, where $m$ is the size of the set of $B_i$. The dimension of all the conditional probability matrices is $m \cdot n$. So the number of constraints grows faster with $m$ than the number of free parameters, which suggests that $m$ should be bounded. But that is far from being a rigorous demonstration.
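The constraint per pair can be written out explicitly. Writing $p_i(v) = P(B_i = 1 \mid V = v)$, conditional independence gives $E[B_i B_j] = \sum_v P(v)\,p_i(v)\,p_j(v)$, and since the $B_i$ are binary, $I(B_i, B_j)$ is equivalent to zero covariance. A sketch of this counting (the helper name `pair_constraint` is my own):

```python
import numpy as np

def pair_constraint(pV, pi, pj):
    """Covariance of B_i and B_j under the naive-Bayes model.

    pV : marginal P(V); pi, pj : vectors P(B_i = 1 | V = v).
    Marginal independence of the binary pair <=> this returns 0.
    """
    return pV @ (pi * pj) - (pV @ pi) * (pV @ pj)

# Counting for a given (n, m): one scalar constraint per pair,
# m * n conditional-probability parameters (plus n - 1 for P(V)).
n, m = 3, 2
num_constraints = m * (m - 1) // 2      # grows quadratically in m
num_params = m * n + (n - 1)            # grows linearly in m
```

This makes the quadratic-versus-linear growth in $m$ explicit, though it does not by itself prove a bound, since the constraints could in principle be redundant.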
For $n=2$ it is not possible to have even two variables $B_1$ and $B_2$: observing $B_1$ modifies the posterior over $V$, which in turn modifies the distribution of $B_2$, so $B_1$ and $B_2$ cannot be independent.
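For $n=2$ this can be checked by direct algebra: with $w = P(V=0)$, the covariance of $B_1$ and $B_2$ factors as $w(1-w)\,(p_1(0)-p_1(1))(p_2(0)-p_2(1))$, which vanishes only when some $B_i$ is actually independent of $V$ or $w \in \{0,1\}$. A numeric sanity check of that factorization on random draws:

```python
import numpy as np

rng = np.random.default_rng(0)
# Verify cov(B_1, B_2) = w (1 - w) (p1(0) - p1(1)) (p2(0) - p2(1))
# for n = 2 over random non-degenerate configurations.
for _ in range(1000):
    w = rng.uniform(0.05, 0.95)          # P(V = 0), kept away from 0 and 1
    p1 = rng.uniform(0, 1, size=2)       # P(B_1 = 1 | V = v)
    p2 = rng.uniform(0, 1, size=2)       # P(B_2 = 1 | V = v)
    pV = np.array([w, 1 - w])
    cov = pV @ (p1 * p2) - (pV @ p1) * (pV @ p2)
    factored = w * (1 - w) * (p1[0] - p1[1]) * (p2[0] - p2[1])
    assert np.isclose(cov, factored)
```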
But for $n \ge 3$, can $m$ be unbounded, and if not, what would be its maximum?