Suppose $A$ and $B$ are jointly independent of $X$ such that $P(A,B|X)=P(A,B)$. Can it be shown that $A$ and $B^c$ are jointly independent of $X$, i.e., that $P(A,B^c|X)=P(A,B^c)$?
Intuition suggests this is true, by analogy with the fact that if $B$ and $X$ are independent, then so are $B^c$ and $X$ (e.g., as shown here). However, I am struggling to prove it, and I wondered if you might offer help. Thank you.
If I am understanding the notation correctly, then I believe the claim is false. I am assuming that $P(A,B)$ denotes the probability that events $A$ and $B$ both occur, and that $P(A,B|X) = P(A,B,X)/P(X)$.
Here is a counterexample. Let $Y$ be the result of rolling a fair six-sided die. Let $A$ be the event that $Y \in \{1,2,3\}$, let $X$ be the event that $Y \neq 1$, and let $B$ be an event that is always false (i.e., $B = \emptyset$). Then
$$P(A,B|X) = 0 = P(A,B),$$ $$P(A, B^c) = P(A) = \frac{1}{2}, \quad\text{and}$$ $$P(A, B^c|X) = P(A|X) = \frac{2/6}{5/6} = \frac{2}{5}.$$

So $P(A,B^c|X) \neq P(A,B^c)$. The intuition from the single-event case fails here because the complement argument requires $B$ itself to be independent of $X$; the hypothesis only makes the joint event $A \cap B$ independent of $X$, and in this example $A$ alone is not independent of $X$.
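For anyone who wants to sanity-check the arithmetic, here is a small Python sketch that enumerates the six equally likely outcomes and verifies the three probabilities exactly with `fractions.Fraction` (the event names mirror the ones above):

```python
from fractions import Fraction

# Sample space: a fair six-sided die, each outcome with probability 1/6.
outcomes = set(range(1, 7))

A = {1, 2, 3}            # event A: Y is 1, 2, or 3
X = {2, 3, 4, 5, 6}      # event X: Y is not 1
B = set()                # event B: always false (empty event)
Bc = outcomes - B        # complement of B: always true

def P(E):
    """Unconditional probability of event E."""
    return Fraction(len(E), len(outcomes))

def P_given(E, F):
    """Conditional probability P(E|F) = P(E and F) / P(F)."""
    return Fraction(len(E & F), len(F))

assert P_given(A & B, X) == P(A & B) == 0        # P(A,B|X) = P(A,B) = 0
assert P(A & Bc) == Fraction(1, 2)               # P(A,B^c) = 1/2
assert P_given(A & Bc, X) == Fraction(2, 5)      # P(A,B^c|X) = 2/5
print(P(A & Bc), P_given(A & Bc, X))             # 1/2 2/5
```

Since $1/2 \neq 2/5$, the counterexample checks out numerically as well.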