Representing pairwise-independent but not jointly independent random variables with a Venn diagram


For random variables $A$, $B$, and $C$ that are pairwise independent (i.e. $I(A;B)=0$, $I(A;C)=0$), it is not true that $I(A;B,C)=0$, since $I(A;B,C)=I(A;B)+I(A;B|C)$ [<- this is not correct, see edit] and we have no information about $I(A;B|C)$.

If I had to draw a Venn diagram for the case above, I would use the one below, in which $I(A;B,C)=0$.

[Image: my Venn interpretation]

I often use Venn diagrams when I need to prove or disprove equalities and inequalities between information measures, but my intuition seems to mislead me here. So my question is: is it possible to draw an appropriate diagram for this case, or should I stop using such diagrams since they do not fit all cases?

EDIT: I made a mistake when I wrote down my question here, possibly misleading some of the answers. The correct identity is $I(A;B,C)=I(A;B)+I(A;C|B)$ (note that it is $A;C|B$, not $A;B|C$). Written correctly, one can try to use the inequality ("conditioning reduces entropy") to get $I(A;B,C)=I(A;B)+I(A;C|B)\le I(A;B)+I(A;C)=0$ (the last equality comes from the assumptions of the question). This only confuses me more, since I did not expect $I(A;B,C)$ to be zero (again, we only assumed pairwise independence between $A,B$ and between $A,C$).
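For reference, the corrected identity is just the chain rule for mutual information; it follows by telescoping conditional entropies:

```latex
\begin{aligned}
I(A;B,C) &= H(A) - H(A\mid B,C) \\
         &= \bigl[H(A) - H(A\mid B)\bigr] + \bigl[H(A\mid B) - H(A\mid B,C)\bigr] \\
         &= I(A;B) + I(A;C\mid B).
\end{aligned}
```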

EDIT2: I am not sure about the inequality $I(A;B)+I(A;C|B)\le I(A;B)+I(A;C)$. If anyone can help out, I'll be glad. I will update when I am sure what happens here.

There are 3 answers below.

Answer (score: 26)

In this chat discussion, we arrived at the following case: [Image: improved case diagram]

Here, $B$ and $C$ are independent random variables, each taking the value $0$ or $1$ with equal probability. Clearly $B$ and $C$ each have $1$ bit of entropy. Think of the bits as two fair coins.

Here, $A$ is defined as $B\ \text{XOR}\ C$. Each coin is determined by $A$ together with the other coin (e.g. $B = A\ \text{XOR}\ C$), and $B$ and $C$ are independent, so in the diagram the regions of $B$ and $C$ must both lie inside $A$ without overlapping each other.

But $A$ is itself a fair bit (the XOR of independent fair bits is uniform), so $A$ has only $1$ bit of entropy, while $B$ and $C$ together account for $2$ bits. Therefore the grey space (that is, the part of $A$ outside $B$ and $C$) would have to contain $-1$ bits of entropy, which makes the Venn diagram invalid.

Therefore, there exists a case for which no Venn diagram is appropriate.
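The entropies and mutual informations above can be verified numerically from the joint distribution of $(A,B,C)$; a minimal Python sketch:

```python
from itertools import product
from math import log2

# Joint distribution of (A, B, C): B, C are independent fair bits, A = B XOR C.
# Each of the 4 equally likely (b, c) pairs determines a.
joint = {}
for b, c in product([0, 1], repeat=2):
    joint[(b ^ c, b, c)] = 0.25

def H(idx):
    """Entropy (bits) of the marginal over the given coordinate indices."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Mutual information via entropies: I(X;Y) = H(X) + H(Y) - H(X,Y)
I_AB  = H([0]) + H([1]) - H([0, 1])        # I(A;B)
I_AC  = H([0]) + H([2]) - H([0, 2])        # I(A;C)
I_ABC = H([0]) + H([1, 2]) - H([0, 1, 2])  # I(A;B,C)
# I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
I_AB_given_C = H([0, 2]) + H([1, 2]) - H([0, 1, 2]) - H([2])

print(I_AB, I_AC, I_ABC)    # -> 0.0 0.0 1.0
print(I_AB - I_AB_given_C)  # central "grey" region I(A;B;C) -> -1.0
```

The last line is the interaction information $I(A;B)-I(A;B|C)$, i.e. the quantity the central region of an information diagram would have to hold; it comes out negative, which no area-based Venn diagram can represent.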

This is a counter-example to the following statement:

For any three random variables $A,B,C$ such that $I(A;B)=0$ and $I(A;C)=0$, a Venn diagram that represents $A$, $B$, and $C$ can be drawn according to entropy.

Answer (score: 2)

I'm new to information theory, but here's my opinion after reading your discussion and links.
I think the point is that in a three-set Venn diagram, the overlapping part of $A$ and $B$ isn't $I(A;B)$ but $I(A;B|C)$, which may not equal $I(A;B)$.
And $I(A;B) = 0$ doesn't imply $I(A;B|C) = 0$, since the inequality $I(A;B|C) \leq I(A;B)$ holds only under a Markov-chain condition $X \to Y \to Z$.
Thus, returning to your question, it might be impossible to draw an appropriate diagram for this case unless you have information about $I(A;B|C)$, in my opinion.

Answer (score: 5)

From Wikipedia:

Conditioning on a third random variable may either increase or decrease the mutual information

I believe this fact prevents the use of Venn diagrams in such cases, since (as the previous answer and I both stated) in the diagram it appears that $I(A;B) \ge I(A;B|C)$, but this is actually NOT true in general.
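Both directions of the Wikipedia statement can be checked numerically: conditioning on $C$ increases $I(A;B)$ in the XOR example from the first answer, and decreases it when $A=B=C$ is a single fair bit copied three times. A minimal Python sketch:

```python
from math import log2

def entropy(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    """Marginal of a joint dict over the given coordinate indices."""
    out = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def I(joint):
    """I(A;B) for a joint over (a, b, c): H(A) + H(B) - H(A,B)."""
    H = lambda idx: entropy(marginal(joint, idx))
    return H([0]) + H([1]) - H([0, 1])

def I_cond(joint):
    """I(A;B|C) for a joint over (a, b, c): H(A,C) + H(B,C) - H(A,B,C) - H(C)."""
    H = lambda idx: entropy(marginal(joint, idx))
    return H([0, 2]) + H([1, 2]) - H([0, 1, 2]) - H([2])

# Increase: A = B XOR C with B, C independent fair bits.
xor = {(b ^ c, b, c): 0.25 for b in (0, 1) for c in (0, 1)}
# Decrease: A = B = C, one fair bit copied three times.
copy = {(b, b, b): 0.5 for b in (0, 1)}

print(I(xor), I_cond(xor))    # -> 0.0 1.0  (conditioning increased it)
print(I(copy), I_cond(copy))  # -> 1.0 0.0  (conditioning decreased it)
```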

EDIT: I am accepting my own answer since I was not convinced that @user351579 is 100% correct, and honestly, knowing that the Venn diagram does not work in ALL cases is an answer I am satisfied with. For a deeper discussion of ways around this, see my conversation/chat with @user351579 and/or his answer.