Consider RVs $X, Y_1, Y_2$ where $Y_1$ and $Y_2$ are conditionally iid given $X$. I want to show that $I(X;Y_1, Y_2) = 2 I(X;Y_1) - I(Y_1;Y_2)$.
For this purpose I started to rewrite:
$$\begin{align} I(X;Y_1, Y_2) &= \sum_{x,y_1,y_2} p(x,y_1,y_2) \log \frac{p(x,y_1,y_2)}{p(x) p(y_1,y_2)} \\ &= \sum_{x,y_1,y_2} p(x)\, p(y_1,y_2\mid x) \log \frac{p(x,y_1,y_2)}{p(x) p(y_1,y_2)} \\ &= \sum_{x,y_1,y_2} p(x, y_1)\, p(y_2\mid x,y_1) \log \frac{p(x)\,p(y_1\mid x)\, p(y_2\mid y_1,x)}{p(x) p(y_1,y_2)} \\ &= \sum_{x,y_1,y_2} p(x,y_1)\, p(y_2\mid x,y_1) \log \frac{p(y_1\mid x)\,p(y_2\mid y_1,x)}{p(y_1,y_2)} \\ &= \sum_{x,y_1,y_2} p(x,y_1)\, p(y_2\mid x,y_1) \log \frac{p(y_1,x)}{p(x)p(y_1)} \frac{p(y_2\mid y_1,x)}{p(y_2\mid y_1)} \\ &= \sum_{x,y_1,y_2} p(x,y_1)\, p(y_2\mid x,y_1) \left[\log \frac{p(y_1,x)}{p(x)p(y_1)} - \log \frac{p(y_2\mid y_1)}{p(y_2\mid y_1,x)}\right] \\ &= \sum_{x,y_1,y_2} p(x,y_1)\, p(y_2\mid x,y_1) \log \frac{p(y_1,x)}{p(x)p(y_1)} - \sum_{x,y_1,y_2} p(x,y_1)\, p(y_2\mid x,y_1) \log \frac{p(y_2\mid y_1)}{p(y_2\mid y_1,x)}, \end{align}$$
where the fifth step uses $p(y_1\mid x) = p(y_1,x)/p(x)$ and $p(y_1,y_2) = p(y_1)\,p(y_2\mid y_1)$.
Now I don't know how to proceed, because in the end I need to arrive at something like:
$$\begin{align} \ldots &= \sum_{x,y_1} p(x,y_1) \log \frac{p(x,y_1)}{p(x)p(y_1)} + \sum_{x,y_1} p(x,y_1) \log \frac{p(x,y_1)}{p(x)p(y_1)} - \sum_{y_1,y_2} p(y_1,y_2) \log \frac{p(y_1,y_2)}{p(y_1)p(y_2)} \\ &= I(X;Y_1) + I(X;Y_1) - I(Y_1;Y_2) \\ &= 2 I(X;Y_1) - I(Y_1;Y_2). \end{align}$$
This type of problem is more easily solved by manipulating mutual information and entropy quantities directly, rather than operating on pmf-based expressions. Of course, the two approaches are equivalent, since the properties of mutual information and entropy follow from manipulating their pmf representations; however, there is no need to "reinvent the wheel".
By repeated application of the chain rule of mutual information, it holds $$ \begin{align} I(X; Y_1, Y_2) &= I(X;Y_1) + I(X;Y_2\mid Y_1)\\ &=I(X;Y_1)+I(X,Y_1;Y_2)-I(Y_1;Y_2) \\ &=I(X;Y_1)+I(X;Y_2)+I(Y_1;Y_2\mid X)-I(Y_1;Y_2)\\ &\stackrel{(a)}{=}I(X;Y_1)+I(X;Y_2)-I(Y_1;Y_2)\\ &\stackrel{(b)}{=}I(X;Y_1)+I(X;Y_1)-I(Y_1;Y_2), \end{align} $$
where $(a)$ follows because $I(Y_1;Y_2\mid X) = 0$, since $Y_1$ and $Y_2$ are conditionally independent given $X$, and $(b)$ because $I(X;Y_2)=I(X;Y_1)$, since $Y_1$ and $Y_2$ have the same conditional distribution given $X$.
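If you want to convince yourself of the identity numerically, here is a small sanity check (a sketch, assuming NumPy; the alphabet sizes and the channel $p(y\mid x)$ are arbitrary illustrative choices, not anything from the problem statement). It builds a joint pmf with $Y_1, Y_2$ conditionally i.i.d. given $X$ and checks $I(X;Y_1,Y_2) = 2I(X;Y_1) - I(Y_1;Y_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# p(x): distribution of X over 3 states (arbitrary example values)
px = np.array([0.2, 0.5, 0.3])

# p(y|x): a random channel over 4 output states; Y1 and Y2 are each
# drawn from this conditional distribution, independently given X
py_x = rng.random((3, 4))
py_x /= py_x.sum(axis=1, keepdims=True)

# Joint pmf p(x, y1, y2) = p(x) p(y1|x) p(y2|x)  (conditionally iid)
p_xyy = px[:, None, None] * py_x[:, :, None] * py_x[:, None, :]

def mutual_info(pab):
    """I(A;B) in nats from a 2-D joint pmf array pab."""
    pa = pab.sum(axis=1, keepdims=True)   # marginal of A, shape (m, 1)
    pb = pab.sum(axis=0, keepdims=True)   # marginal of B, shape (1, n)
    mask = pab > 0                        # skip zero-probability terms
    return np.sum(pab[mask] * np.log(pab[mask] / (pa * pb)[mask]))

# I(X; Y1, Y2): treat the pair (y1, y2) as a single 16-state variable
I_x_y1y2 = mutual_info(p_xyy.reshape(3, -1))
I_x_y1   = mutual_info(p_xyy.sum(axis=2))   # marginalize out y2
I_y1_y2  = mutual_info(p_xyy.sum(axis=0))   # marginalize out x

assert np.isclose(I_x_y1y2, 2 * I_x_y1 - I_y1_y2)
```

The same check with any other `px` and `py_x` (as long as both rows of $p(y\mid x)$ feeding $Y_1$ and $Y_2$ are identical) will satisfy the identity, while two *different* channels for $Y_1$ and $Y_2$ generally break it, which is a nice way to see where conditional i.i.d. enters.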
As an exercise, I would recommend attempting to prove this relation starting from the well-known formula $$ I(X;Y_1,Y_2)=H(Y_1,Y_2)-H(Y_1,Y_2\mid X) $$ and using properties of entropy. (Again, the chain rule is essential, as is exploiting the fact that $Y_1,Y_2$ are i.i.d. given $X$.)