This is the whole assignment text, for context:
Note on notation: $H$ denotes entropy and $I$ denotes mutual information.
Let $(\Omega,\mathcal{A}, P)$ be a probability space with $A,B \in \mathcal{A}, A \cap B = \emptyset$ and $P(A) = P(B) = \frac{1}{4}$. Let $X,Y : \Omega \rightarrow \{0,1,-1\}$ be random variables defined by
$X(\omega) = \begin{cases} 1 & \omega \in A \\ -1 & \omega \in B \\ 0 & \text{otherwise} \end{cases}$ and $Y(\omega) = \begin{cases} -1 & \omega \in A \\ 1 & \omega \in B \\ 0 & \text{otherwise} \end{cases}$
(a) Why are $X$ and $Y$ not independent?
(b) Show that $H(X)=H(Y)=H(X,Y)=I(X;Y)=\frac{3}{2}$ and $H(X|Y)=H(Y|X) = 0$.
(c) Let $Z := XY$. Show that $H(Z) < H(X,Y) = H(X,Y,Z)$ and $H(Z|X)=0$ but $H(X|Z)>0$.
(d) Calculate $H(X|Z)$ and show that $H(X|Z)=I(X;Y|Z)$.
I need help answering (c). For (a) and (b) I wrote down the joint distribution of $X$ and $Y$ in a table:
$ \begin{array}{|c|c|c|c|} \hline P(X=x,Y=y) & y=-1 & y=0 & y=1 \\\hline x=-1 & 0 & 0 & \frac{1}{4} \\\hline x=0 & 0 & \frac{1}{2} & 0 \\\hline x=1 & \frac{1}{4} & 0 & 0 \\\hline \end{array} $
With this table I can easily calculate everything asked in (a) and (b), but I do not understand the new random variable $Z := XY$ or how it relates to $X$ and $Y$. Can someone explain this to me?
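For what it's worth, here is the quick numerical check I used for the values in (b), a minimal Python sketch with the probabilities hard-coded from the table above (the function and variable names are my own):

```python
from math import log2

# Joint distribution of (X, Y), read off the table above: {(x, y): probability}
joint = {(1, -1): 0.25, (-1, 1): 0.25, (0, 0): 0.5}

def H(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y, obtained by summing out the other variable
pX, pY = {}, {}
for (x, y), p in joint.items():
    pX[x] = pX.get(x, 0) + p
    pY[y] = pY.get(y, 0) + p

HX, HY, HXY = H(pX), H(pY), H(joint)
I = HX + HY - HXY           # I(X;Y) = H(X) + H(Y) - H(X,Y)
HX_given_Y = HXY - HY       # H(X|Y) = H(X,Y) - H(Y) by the chain rule
print(HX, HY, HXY, I, HX_given_Y)  # 1.5 1.5 1.5 1.5 0.0
```

This confirms $H(X)=H(Y)=H(X,Y)=I(X;Y)=\frac{3}{2}$ and $H(X|Y)=0$ as in (b); by symmetry of the table, $H(Y|X)=0$ as well.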