Let $X$ and $Y$ be jointly distributed RVs on domains $\mathcal{X}$ and $\mathcal{Y}$ respectively, and let $Z = g(X)$ on $\mathcal{X}$.
The problem I am trying to solve is whether $H(Y|X) \le H(Y|Z)$, and so far I have shown that $H(Y|Z) + H(X|Y,Z) \ge H(Y|X)$.
I want to get rid of the $H(X|Y,Z)$ term, and I think it should be $0$, but I can't find this property anywhere. I know that $H(Z|X) = 0$, but would $H(X|Z) = 0$ hold as well? I was thinking it would if $g$ is injective, but I'm not sure whether this condition is necessary.
The data processing inequality states that you cannot create information by applying arbitrary (deterministic) transformations to the data: $$I(X : Y) \geq I(f(X) : g(Y))$$
The mutual information is defined as
$$I(X : Y) = H(Y) - H(Y | X)$$
Applying this definition to your problem (taking $f$ to be the identity on $X$):
$$H(Y) - H(Y | X) \geq H(Y) - H(Y | g(X))$$
$$H(Y | X) \leq H(Y | g(X))$$
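As a sanity check, here is a small numerical experiment (a made-up joint pmf of my own choosing, with the non-injective $g(x) = x \bmod 2$) confirming $H(Y|X) \le H(Y|g(X))$:

```python
import math
import random

random.seed(0)

# Hypothetical toy joint pmf over X in {0,1,2,3}, Y in {0,1}.
w = [random.random() for _ in range(8)]
p = {(x, y): w[2 * x + y] / sum(w) for x in range(4) for y in range(2)}

g = lambda x: x % 2  # non-injective: merges {0, 2} and {1, 3}

def cond_entropy(joint):
    """H(Y | A) in bits, for a joint pmf given as {(a, y): prob}."""
    marg = {}
    for (a, _), pr in joint.items():
        marg[a] = marg.get(a, 0.0) + pr
    return -sum(pr * math.log2(pr / marg[a])
                for (a, _), pr in joint.items() if pr > 0)

# Joint pmf of (g(X), Y): merge the X-values that g maps together.
pz = {}
for (x, y), pr in p.items():
    pz[(g(x), y)] = pz.get((g(x), y), 0.0) + pr

print(cond_entropy(p), cond_entropy(pz))  # H(Y|X) <= H(Y|Z)
```

The same script works for any finite joint pmf and any $g$; merging rows of the pmf according to $g$ is exactly what conditioning on $Z = g(X)$ amounts to.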
Also, as for your solution, it is indeed the case that conditioning on $g(X)$ cannot increase entropy:
$$H(X | Y, g(X)) \leq H(X|Y)$$
because knowing a function of a variable reduces uncertainty about the variable itself. However, since the function can destroy information, that reduction need not completely resolve the uncertainty, so the term is not zero in general. (If $g$ is injective, then $H(X | g(X)) = 0$ and the term does vanish, which matches your intuition.) You must have used another inequality somewhere in your proof, which weakened the statement.
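To see concretely that the term need not vanish, here is a minimal counterexample of my own construction: $X$ uniform on $\{0,1,2,3\}$, $Y$ an independent fair coin, and $Z = g(X) = \lfloor X/2 \rfloor$. Knowing both $Y$ and $Z$ still leaves one full bit of uncertainty about $X$:

```python
import math

# X uniform on {0,1,2,3}; Y an independent fair coin; Z = g(X) = X // 2.
# g merges pairs of X-values, so it destroys one bit of X.
p = {(x, y): 1 / 8 for x in range(4) for y in range(2)}
g = lambda x: x // 2

# Compute H(X | Y, Z) by conditioning on the pair (Y, Z).
marg, joint = {}, {}
for (x, y), pr in p.items():
    cond = (y, g(x))
    marg[cond] = marg.get(cond, 0.0) + pr
    joint[(x, cond)] = joint.get((x, cond), 0.0) + pr

h = -sum(pr * math.log2(pr / marg[c]) for (x, c), pr in joint.items())
print(h)  # 1.0 — H(X | Y, g(X)) is a full bit, not zero
```

Here $Y$ is useless (it is independent of $X$) and $Z$ pins $X$ down only to a pair of values, so $H(X|Y,Z) = 1$ bit.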