$I(X;Y \mid Z) > I(X;Y)$ can happen: for example, take $X$ and $Y$ independent fair bits and $Z = X+Y$.
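A quick sanity check of the numbers, computed exactly from the joint pmf (a minimal sketch; the pmf construction and the helper names `marginal`, `mi`, `cmi` are mine, not standard library functions):

```python
from itertools import product
from math import log2

# Joint distribution of (X, Y, Z) for independent fair bits X, Y and Z = X + Y.
pxyz = {}
for x, y in product([0, 1], repeat=2):
    pxyz[(x, y, x + y)] = 0.25

def marginal(p, keep):
    """Sum a joint pmf down to the coordinates listed in `keep`."""
    out = {}
    for k, v in p.items():
        key = tuple(k[i] for i in keep)
        out[key] = out.get(key, 0.0) + v
    return out

def mi(p):
    """I(X;Y) in bits, from a joint pmf over pairs (x, y)."""
    px = marginal(p, (0,))
    py = marginal(p, (1,))
    return sum(v * log2(v / (px[(x,)] * py[(y,)]))
               for (x, y), v in p.items() if v > 0)

def cmi(p):
    """I(X;Y|Z) in bits, from a joint pmf over triples (x, y, z)."""
    pz = marginal(p, (2,))
    total = 0.0
    for (z,), pzv in pz.items():
        cond = {(x, y): v / pzv for (x, y, zz), v in p.items() if zz == z}
        total += pzv * mi(cond)
    return total

pxy = marginal(pxyz, (0, 1))
print(mi(pxy))    # I(X;Y)   = 0.0  (X and Y are independent)
print(cmi(pxyz))  # I(X;Y|Z) = 0.5  (given Z = 1, Y is determined by X)
```

So conditioning on $Z$ raises the mutual information from $0$ to $\tfrac{1}{2}$ bit: given $Z=1$ (probability $\tfrac12$), knowing $X$ pins down $Y$ exactly.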
How can this fact be shown using Venn diagrams? Intuitively, conditioning would mean removing the mass of the conditioning random variable, and so it should only ever reduce the mutual information.
The representation of the joint and conditional entropies of two variables using Venn diagrams is just an illustration that happens to fit the relevant properties... for two variables. Some authors criticize and discourage this representation because, though useful as a mnemonic, it looks more meaningful than it really is, and it can be misleading in several ways: in particular, it strongly suggests that it should generalize to three variables. Well, it doesn't.
There is no useful or satisfactory way of representing the joint and conditional entropies of three variables using Venn diagrams. See e.g. MacKay's *Information Theory, Inference, and Learning Algorithms*, which uses precisely this example (exercise 8.8).
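One way to see why the three-variable picture breaks (the name *interaction information* for the central quantity is my addition; the arithmetic follows directly from the example in the question): the central cell of a three-set diagram would have to carry the signed quantity

$$I(X;Y;Z) \;=\; I(X;Y) - I(X;Y \mid Z) \;=\; 0 - \tfrac{1}{2} \;=\; -\tfrac{1}{2}\ \text{bit},$$

i.e. a negative "area". Areas in a Venn diagram cannot be negative, which is exactly why the diagram cannot be read literally for three variables.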