Let $X$, $Y$, $Z$ be random variables. I learned from Theorem 17.2 of the book All of Statistics by Larry Wasserman that, under the assumption that the joint distribution of $(X,Y,Z)$ is everywhere positive, $X \perp Y \mid Z$ and $X \perp Z \mid Y$ together imply $X \perp \{Y, Z\}$. But I could not prove it. Could anyone help?
$X \perp Y \mid Z$ means $X$ is independent of $Y$ conditionally on $Z$, similarly $X \perp Z \mid Y$. $X \perp \{Y, Z\}$ means $X$ is independent of both $Y$ and $Z$.
In the book itself, the assumption is stated as "all events have positive probability". Since it might be unclear what "events" refers to here, I have rephrased it as "the joint distribution of $(X,Y,Z)$ is everywhere positive".
I found this question to be essentially the same as Show that $X \perp Y|Z,W$ and $X \perp W | Z,Y$ implies $X \perp Y|Z$ and $X \perp W|Z$. A proof goes as follows.
Since $X \perp Y \mid Z$ and $X \perp Z \mid Y$, the conditional $P(X\mid Y,Z)$ can be written in two ways (positivity guarantees that all the conditionals below are well defined):
\begin{align} P(X\mid Y,Z) = P(X \mid Y) = P(X \mid Z). \end{align}
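For completeness, each equality can be traced back to the definition of conditional independence; positivity is what makes the divisions below legitimate:
\begin{align} P(X\mid Y,Z) &= \frac{P(X,Z\mid Y)}{P(Z\mid Y)} = \frac{P(X\mid Y)\,P(Z\mid Y)}{P(Z\mid Y)} = P(X\mid Y) && \text{by } X \perp Z \mid Y, \\ P(X\mid Y,Z) &= \frac{P(X,Y\mid Z)}{P(Y\mid Z)} = \frac{P(X\mid Z)\,P(Y\mid Z)}{P(Y\mid Z)} = P(X\mid Z) && \text{by } X \perp Y \mid Z. \end{align}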
Using the above equality, \begin{align} P(X)P(Y) &= \int_{Z}P(Z)P(X\mid Z)P(Y)\,dZ \\ &= \int_{Z}P(Z)P(X\mid Y)P(Y)\,dZ \\ &= P(X\mid Y)P(Y)\int_{Z}P(Z)\,dZ \\ &= P(X\mid Y)P(Y) \\ &= P(X,Y). \end{align}
Thus $X \perp Y$, and $X \perp Z$ follows in the same way by symmetry. Finally, since $X \perp Y$ gives $P(X \mid Y) = P(X)$, the identity $P(X\mid Y,Z) = P(X \mid Y)$ yields $P(X\mid Y,Z) = P(X)$, which is exactly $X \perp \{Y, Z\}$.
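As a numerical sanity check of the statement and of the proof's key chain, here is a small discrete sketch (the distributions below are hypothetical, chosen only for illustration): we build a strictly positive joint pmf in which $X$ is independent of $(Y,Z)$, so that both hypotheses hold, and verify them together with the conclusion and the integral step (a sum in the discrete case).

```python
import itertools

# Hypothetical strictly positive joint pmf over (X, Y, Z), each in {0, 1},
# with X independent of (Y, Z). Both X ⊥ Y | Z and X ⊥ Z | Y then hold.
px = {0: 0.3, 1: 0.7}                      # marginal pmf of X
yz = {(0, 0): 0.1, (0, 1): 0.2,
      (1, 0): 0.3, (1, 1): 0.4}            # joint pmf of (Y, Z)
joint = {(x, y, z): px[x] * yz[(y, z)]
         for x, y, z in itertools.product([0, 1], repeat=3)}

def marg(keep):
    """Marginal pmf over the coordinate positions listed in `keep`."""
    out = {}
    for xyz, p in joint.items():
        key = tuple(xyz[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

pX, pY, pZ = marg([0]), marg([1]), marg([2])
pXY, pXZ, pYZ = marg([0, 1]), marg([0, 2]), marg([1, 2])

for x, y, z in itertools.product([0, 1], repeat=3):
    # X ⊥ Y | Z:  P(x, y | z) = P(x | z) P(y | z)
    assert abs(joint[(x, y, z)] / pZ[(z,)]
               - pXZ[(x, z)] / pZ[(z,)] * pYZ[(y, z)] / pZ[(z,)]) < 1e-12
    # X ⊥ Z | Y:  P(x, z | y) = P(x | y) P(z | y)
    assert abs(joint[(x, y, z)] / pY[(y,)]
               - pXY[(x, y)] / pY[(y,)] * pYZ[(y, z)] / pY[(y,)]) < 1e-12
    # Conclusion X ⊥ {Y, Z}:  P(x, y, z) = P(x) P(y, z)
    assert abs(joint[(x, y, z)] - pX[(x,)] * pYZ[(y, z)]) < 1e-12

# The proof's chain in discrete form: P(x)P(y) = sum_z P(z) P(x|z) P(y) = P(x, y)
for x, y in itertools.product([0, 1], repeat=2):
    s = sum(pZ[(z,)] * (pXZ[(x, z)] / pZ[(z,)]) * pY[(y,)] for z in (0, 1))
    assert abs(pX[(x,)] * pY[(y,)] - s) < 1e-12
    assert abs(s - pXY[(x, y)]) < 1e-12

print("all checks passed")
```

Of course this only exercises one example rather than proving the theorem, but every assertion mirrors a line of the argument above.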