I want to prove the following inequality (it should be true):
Let $A,B,C$ be independent random variables, and let $U$ be given by a conditional law, so that $P_{UABC}=P_{ABC}\cdot P_{U|ABC}$. Then: $$ I(U;BC)+I(U;CA)+I(U;AB)\leq 2I(U;ABC)$$
I guess it is not that difficult to prove, and I guess the proof goes in steps: first bound $I(U;CA)+I(U;AB)$ by something, then bound $I(U;BC)+\dots$, and finally arrive at the bound we want (this intuition comes from the proof of an equivalent problem, which has these three steps and is not really difficult). But since I'm not used to mutual information, I don't know how to carry this out...
Remember that if you're not used to mutual information, you can always just deal with entropies, since $I(X;Y) = H(X) - H(X|Y)$. I'll use this method below.
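To make the identity $I(X;Y) = H(X) - H(X|Y)$ concrete, here is a small numerical sanity check (the joint pmf below is an arbitrary example I made up, not from the problem): it computes $I(X;Y)$ once directly from the definition and once via entropies, using $H(X|Y) = H(X,Y) - H(Y)$.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A small example joint pmf P(X, Y): rows index X, columns index Y.
pxy = np.array([[0.25, 0.25],
                [0.10, 0.40]])

px = pxy.sum(axis=1)  # marginal of X
py = pxy.sum(axis=0)  # marginal of Y

# I(X;Y) directly from the definition sum p(x,y) log p(x,y)/(p(x)p(y)).
i_direct = sum(pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))
               for i in range(2) for j in range(2) if pxy[i, j] > 0)

# I(X;Y) = H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y).
h_x_given_y = entropy(pxy.ravel()) - entropy(py)
i_entropies = entropy(px) - h_x_given_y

print(i_direct, i_entropies)  # the two values agree
```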
Okay, we want to exploit the independence of $A,B,C$, since that's the only information we have. Recall that for independent $X,Y$ we have $H(X,Y) = H(X) + H(Y)$. To use this, we expand each $I(U;\cdot)$ as $H(\cdot) - H(\cdot|U)$. By independence, $H(AB)+H(BC)+H(CA) = 2\big(H(A)+H(B)+H(C)\big) = 2H(ABC)$, so the unconditional entropies on both sides cancel and the required inequality is equivalent to
\begin{align} 2H(ABC|U) &\overset{?}{\le} H(AB|U) + H(BC|U) + H(CA|U) \\\iff H(C|UAB) + H(A|UBC) &\overset{?}{\le} H(C|U) + H(A|UC) \end{align}
where in the second line we've used the chain rule twice, as $H(ABC|U) = H(AB|U) + H(C|ABU)$ and $H(ABC|U) = H(BC|U) + H(A|BCU)$, cancelled $H(AB|U)+H(BC|U)$ from both sides, and expanded $H(CA|U) = H(C|U) + H(A|CU)$. Now the final inequality holds since conditioning reduces entropy: $H(C|UAB) \le H(C|U)$ and $H(A|UBC) \le H(A|UC)$.
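The whole inequality can also be checked numerically. The sketch below (my own construction, not part of the proof) builds independent binary $A,B,C$, draws a random channel $P_{U|ABC}$ with $|U|=3$, and compares both sides:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Independent binary A, B, C with arbitrary marginals.
pa, pb, pc = np.array([0.3, 0.7]), np.array([0.6, 0.4]), np.array([0.5, 0.5])
p_abc = pa[:, None, None] * pb[None, :, None] * pc[None, None, :]

# A random channel P(U|A,B,C) with |U| = 3 defines the joint P(U,A,B,C).
pu_given = rng.random((3, 2, 2, 2))
pu_given /= pu_given.sum(axis=0, keepdims=True)
p_uabc = pu_given * p_abc[None, ...]  # axes: (U, A, B, C)

def mi_u_vs(keep):
    """I(U; S) for the subset S of {A,B,C} given by axes (1=A, 2=B, 3=C)."""
    drop = tuple(ax for ax in (1, 2, 3) if ax not in keep)
    p_us = p_uabc.sum(axis=drop) if drop else p_uabc
    pu = p_us.reshape(3, -1).sum(axis=1)  # marginal of U
    ps = p_us.sum(axis=0)                 # marginal of S
    # I(U;S) = H(U) + H(S) - H(U,S)
    return entropy(pu) + entropy(ps.ravel()) - entropy(p_us.ravel())

lhs = mi_u_vs((2, 3)) + mi_u_vs((3, 1)) + mi_u_vs((1, 2))
rhs = 2 * mi_u_vs((1, 2, 3))
print(lhs, rhs)  # lhs <= rhs, as the inequality claims
```

Re-running with other seeds or marginals (keeping $A,B,C$ independent) should never produce a violation; if you drop the independence of $A,B,C$, counterexamples become possible.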