Conditional independence of $A$ and $B$ given $C$ means
$$p(A, B \, | \, C) = p(A \, | \, C)\:p(B \, | \, C).$$
If the two sides are equal, then $A$ and $B$ are independent conditional on $C$ and there's nothing more to it.
But what if they are only roughly equal? Does that suggest anything? If you were to estimate both sides empirically from a dataset, how could you decide how close to equality is close enough?
Since independence and uncorrelatedness coincide for two Bernoulli variables, and since indicator functions of random events are Bernoulli, the simplest answer is to compute a confidence interval (or a credible interval, if you're a Bayesian) for the correlation of $I_A$ with $I_B$, conditioning on $C$ (e.g., by restricting to the samples where $C$ holds), and see whether it includes $0$.
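As a minimal sketch of this idea: the code below simulates toy data in which $A$ and $B$ are (by construction) independent given the event $C$, restricts to the samples where $C$ holds, and forms an approximate 95% confidence interval for the correlation of the indicators via the Fisher z-transform. All names and parameters here are hypothetical, and the Fisher z interval is only an approximation for binary variables; a bootstrap or an exact test of independence would be a more careful choice.

```python
import numpy as np

def corr_ci(x, y, z_crit=1.96):
    """Approximate 95% CI for the Pearson correlation of x and y
    using the Fisher z-transform (a rough approximation for 0/1 data)."""
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    z = np.arctanh(r)                  # Fisher z-transform
    se = 1.0 / np.sqrt(n - 3)          # approximate standard error of z
    lo = np.tanh(z - z_crit * se)
    hi = np.tanh(z + z_crit * se)
    return r, lo, hi

# Hypothetical toy data: A and B are independent conditional on C,
# but both depend on C (so they are correlated unconditionally).
rng = np.random.default_rng(0)
n = 20_000
C = rng.random(n) < 0.5
pA = np.where(C, 0.7, 0.2)             # p(A | C) and p(A | not C)
pB = np.where(C, 0.4, 0.8)             # p(B | C) and p(B | not C)
A = rng.random(n) < pA
B = rng.random(n) < pB                 # drawn independently of A given C

# Condition on C by restricting to the samples where C holds,
# then check whether the interval for corr(I_A, I_B) includes 0.
r, lo, hi = corr_ci(A[C].astype(float), B[C].astype(float))
print(f"conditional corr = {r:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

If the interval contains $0$, the data are consistent with conditional independence of $A$ and $B$ given $C$; the width of the interval quantifies how much "roughly equal" the sample size can actually resolve.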