I know that $P(AB) = P(A)P(B) \land P(BC) = P(B)P(C) \land P(AC) = P(A)P(C)$ does not imply $P(ABC) = P(A)P(B)P(C)$.
But does $P(ABC) = P(A)P(BC) = P(B)P(AC) = P(C)P(AB)$ imply $P(ABC) = P(A)P(B)P(C)$?
I don't see a way to prove it; the product rule alone doesn't seem to be enough, but whatever I try, I seem to be going in circles.
I did notice that $P(C|A\lnot B) = P(C)$ does not imply $P(C|A) = P(C) \land P(C|\lnot B) = P(C)$. The counterexample is to make $A$ and $\lnot B$ nudge the posterior in opposite directions. For example, let $A, B, D$ be independent with $P(A) = P(B) = P(D) = 0.5$, and $C =$ "at least two of $A, B, D$ are true". Now knowing $A$ and $\lnot B$ together doesn't tell us anything about $C$, but knowing $A$ alone or $\lnot B$ alone does.
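This is easy to check by brute-force enumeration of the eight equally likely outcomes (a quick sketch; the helper names `prob` and `cond` are my own):

```python
from itertools import product

# The 8 equally likely outcomes w = (a, b, d) of independent fair events A, B, D.
outcomes = list(product([0, 1], repeat=3))

def prob(pred):
    """Probability of an event under the uniform distribution on outcomes."""
    return sum(1 for w in outcomes if pred(w)) / len(outcomes)

def cond(pred, given):
    """Conditional probability P(pred | given)."""
    return prob(lambda w: pred(w) and given(w)) / prob(given)

# C = "at least two of A, B, D are true"
C = lambda w: w[0] + w[1] + w[2] >= 2

print(prob(C))                                     # P(C)           = 0.5
print(cond(C, lambda w: w[0] == 1 and w[1] == 0))  # P(C | A, ~B)   = 0.5
print(cond(C, lambda w: w[0] == 1))                # P(C | A)       = 0.75
print(cond(C, lambda w: w[1] == 0))                # P(C | ~B)      = 0.25
```

So conditioning on $A$ and $\lnot B$ jointly leaves $C$ at its prior, while either piece alone shifts it.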
I wrote an experiment in Python that makes me believe the implication is true. It generates random distributions and enforces the above equalities for a number of iterations (the process seems to converge quite quickly):
import numpy as np

r = 0.0
for trial in range(100):
    # Start from a random joint distribution over three binary variables.
    P = np.random.random(size=(2, 2, 2))
    P = P / np.sum(P)
    for it in range(50):
        # Enforce P(ijk) = P(i)P(jk) = P(j)P(ik) = P(k)P(ij) by replacing
        # each cell with the average of the three products.
        for i in range(2):
            for j in range(2):
                for k in range(2):
                    pi = P[i, :, :].sum()
                    pj = P[:, j, :].sum()
                    pk = P[:, :, k].sum()
                    pij = P[i, j, :].sum()
                    pik = P[i, :, k].sum()
                    pjk = P[:, j, k].sum()
                    P[i, j, k] = (pi * pjk + pj * pik + pk * pij) / 3.0
        P = P / np.sum(P)
    # Largest observed violation of pairwise independence P(AB) = P(A)P(B).
    r = max(r, abs(P[1, :, :].sum() * P[:, 1, :].sum() - P[1, 1, :].sum()))
print(r)
Still, maybe it's just very unlikely for this iteration to converge to an interesting distribution. Is there a proof? If not, does anyone see a counterexample?
Counter-example:
Throw a fair die and define events:
\begin{align} A &= \{1,2,3,4\} \\ B &= \{2,3,4,5\} \\ C &= \{1,2,3,5\}. \\ \end{align}
Then
\begin{align} & P(ABC)=1/3 \\ & P(AB)=P(AC)=P(BC)=1/2 \\ & P(A)=P(B)=P(C)=2/3 \\ \text{So}\quad &P(ABC)=P(A)P(BC)=P(B)P(AC)=P(C)P(AB) \\ \text{But}\quad &P(ABC)=1/3\neq 8/27 = P(A)P(B)P(C). \end{align}
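The arithmetic can be verified with exact rationals (a short sketch of the check, using Python's standard `fractions` module):

```python
from fractions import Fraction

# Events from the fair-die counterexample, as subsets of {1, ..., 6}.
A = {1, 2, 3, 4}
B = {2, 3, 4, 5}
C = {1, 2, 3, 5}

def P(event):
    """Probability of an event under a fair six-sided die."""
    return Fraction(len(event), 6)

pABC = P(A & B & C)
# All three "one factored out" identities hold ...
assert pABC == P(A) * P(B & C) == P(B) * P(A & C) == P(C) * P(A & B)
# ... yet full mutual independence fails.
print(pABC, P(A) * P(B) * P(C))  # 1/3 8/27
```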