$I$ is mutual information.
\begin{align*} I(X, Y ; Z) &= D_{KL} \left( P_{XYZ} \,\|\, P_{XY} \otimes P_Z \right)\\ &= \sum_{x, y, z} P_{XYZ}(x, y, z) \log \left( \frac{P_{XYZ}(x, y, z)}{P_{XY}(x, y) P_Z(z)} \right) \\ &= \sum_{x, y, z} P_Y(y) P_{XZ|Y}(x, z | y) \log \left( \frac{ P_{XZ|Y}(x, z | y)}{ P_{X|Y}(x|y) P_Z(z)} \right) \tag{1} \end{align*}
and \begin{align*} I(X; Z) &= \sum_{x, z} P_{XZ}(x, z) \log \left( \frac{P_{XZ}(x, z)}{P_X(x) P_Z(z)} \right) \tag{2} \end{align*}
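As a quick numerical sanity check, both quantities can be computed directly from a joint distribution table; the toy $2 \times 2 \times 2$ table below is hypothetical (a random normalized array), just to show that $(1)$ and $(2)$ generally differ:

```python
import numpy as np

def mutual_info(p_ab, eps=1e-12):
    """I(A; B) in nats from a joint distribution table p_ab[a, b]."""
    p_a = p_ab.sum(axis=1, keepdims=True)   # marginal P_A as a column
    p_b = p_ab.sum(axis=0, keepdims=True)   # marginal P_B as a row
    mask = p_ab > eps                        # skip zero-probability cells
    return float((p_ab[mask] * np.log(p_ab[mask] / (p_a * p_b)[mask])).sum())

# Hypothetical toy joint P_{XYZ}: a random 2x2x2 table, normalized.
rng = np.random.default_rng(0)
p_xyz = rng.random((2, 2, 2))
p_xyz /= p_xyz.sum()

# (1): I(X, Y; Z) treats the pair (X, Y) as a single variable,
# so flatten the first two axes into one row index.
I_xy_z = mutual_info(p_xyz.reshape(4, 2))

# (2): I(X; Z) uses the marginal P_{XZ} (sum out y).
I_x_z = mutual_info(p_xyz.sum(axis=1))

print(I_xy_z, I_x_z)   # generally not equal; (1) is never smaller than (2)
```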
I need a necessary and sufficient condition for equality between $(1)$ and $(2)$.
It seems to me a sufficient condition would be:
$$(X, Z) \text{ is independent of } Y \tag{stat}$$
Furthermore, we know intuitively that the quantity $I(X, Y ; Z)$ measures the amount of information that $(X, Y)$ jointly convey about $Z$.
But if $X = Y$, then $X$ alone conveys the same amount of information about $Z$ as $(X, Y)$ together do, so
$$I(X, Y ; Z) = I(X; Z) \quad \text{if } X = Y \tag{stat2}$$
So I believe the conditions in $\text{(stat)}$ and $\text{(stat2)}$ are sufficient but not necessary.
The chain rule for mutual information is $$I(X_1, X_2, \dots, X_n; Y) = I(X_1; Y) + \sum_{i = 2}^n I(X_i; Y \mid X_{1}, \dots, X_{i-1})$$
Applying this to $I(X, Y; Z)$, we get: $$I(X, Y; Z) = I(X; Z) + I(Y; Z | X)$$
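This decomposition can also be checked numerically. The sketch below (assuming a random hypothetical $3 \times 3 \times 3$ joint table) computes $I(Y; Z \mid X)$ as $\sum_x P(x)\, I(Y; Z \mid X = x)$ and confirms the chain-rule identity:

```python
import numpy as np

def mi(p, eps=1e-12):
    """I(A; B) in nats from a joint table p[a, b]."""
    pa = p.sum(axis=1, keepdims=True)
    pb = p.sum(axis=0, keepdims=True)
    m = p > eps
    return float((p[m] * np.log(p[m] / (pa * pb)[m])).sum())

def cond_mi(p_xyz, eps=1e-12):
    """I(Y; Z | X) = sum over x of P(x) * I(Y; Z | X = x)."""
    total = 0.0
    for x in range(p_xyz.shape[0]):
        px = p_xyz[x].sum()
        if px > eps:
            total += px * mi(p_xyz[x] / px)   # p_xyz[x]/px is P_{YZ|X=x}
    return total

# Hypothetical random joint distribution over (x, y, z).
rng = np.random.default_rng(1)
p = rng.random((3, 3, 3))
p /= p.sum()

lhs = mi(p.reshape(9, 3))                  # I(X, Y; Z), pair (x, y) flattened
rhs = mi(p.sum(axis=1)) + cond_mi(p)       # I(X; Z) + I(Y; Z | X)
print(lhs, rhs)                            # equal up to floating-point error
```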
Hence, $$I(X, Y; Z) = I(X; Z) \,\, \iff \,\, I(Y; Z| X) = 0$$
In words, $I(Y; Z | X) = 0$ means that once $X$ is known, $Y$ provides no further information about $Z$; that is, $Y$ and $Z$ are conditionally independent given $X$.
In the Markov chain formulation, $I(Y; Z | X) = 0$ if and only if $X, Y, Z$ form the Markov chain $$Y - X - Z$$
The conditions $\text{(stat)}$ and $\text{(stat2)}$ mentioned in the original question are both sufficient, and each implies $I(Y; Z | X) = 0$.
But the necessary and sufficient condition is $I(Y; Z | X) = 0$.