I want to show that in case $(X,Y,Z)$ is Markovian (and therefore $X$ and $Z$ are conditionally independent given $Y$), the mutual information $I(X;Z|Y) = 0$.
My thoughts: $I(X;Z|Y) = H(X|Z) - H(X|Y,Z) = H(X,Z) - H(Z) - H(X|Y,Z)$, but I can't get any further... any ideas?
If $(X,Y, Z)$ is Markovian, then $H(Z|Y,X)=H(Z|Y).$
And just remember that mutual information is symmetric in its arguments; precisely: $$I(X;Y)=I(Y;X).$$
EDIT: to be more precise: $$I(X;Z|Y)=I(Z;X|Y)=H(Z|Y)-H(Z|X, Y)=H(Z|Y)-H(Z|Y)=0$$
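If you want to convince yourself numerically, here is a small sketch (my own example, not part of the argument above): build an arbitrary joint distribution with the Markov factorization $P(x,y,z)=P(x)P(y|x)P(z|y)$ and evaluate $I(X;Z|Y)=\sum_{x,y,z} P(x,y,z)\log\frac{P(y)\,P(x,y,z)}{P(x,y)\,P(y,z)}$ directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary small alphabets and random distributions for a chain X -> Y -> Z
nx, ny, nz = 3, 4, 2
px = rng.random(nx); px /= px.sum()                                # p(x)
py_x = rng.random((nx, ny)); py_x /= py_x.sum(1, keepdims=True)    # p(y|x)
pz_y = rng.random((ny, nz)); pz_y /= pz_y.sum(1, keepdims=True)    # p(z|y)

# Joint p(x,y,z) = p(x) p(y|x) p(z|y): the Markov factorization
p = px[:, None, None] * py_x[:, :, None] * pz_y[None, :, :]

# Marginals needed for I(X;Z|Y)
py  = p.sum(axis=(0, 2))   # p(y)
pxy = p.sum(axis=2)        # p(x,y)
pyz = p.sum(axis=0)        # p(y,z)

# I(X;Z|Y) = sum p(x,y,z) log[ p(y) p(x,y,z) / (p(x,y) p(y,z)) ]
I = np.sum(p * np.log(py[None, :, None] * p
                      / (pxy[:, :, None] * pyz[None, :, :])))
print(I)  # ~ 0 up to floating-point error
```

Each term of the sum vanishes exactly, since $P(y)P(x,y,z)=P(x,y)P(y,z)$ under the factorization, so the result is $0$ up to floating-point error for any choice of the random distributions.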
And I see now that you defined $I(X;Z|Y)$ incorrectly, because: $$I(X;Z|Y)\neq H(X|Z)-H(X|Y,Z);$$ it is: $$I(X;Z|Y)= H(X|Y)-H(X|Y,Z).$$
And to see why $H(Z|X,Y)=H(Z|Y)$, look at: $$P\{X=x, Y=y, Z=z\}=P\{X=x\}\cdot P\{Y=y|X=x\}\cdot P\{Z=z|Y=y\}, $$ and try to understand why it is true and how we can derive from it that $X$ and $Z$ are actually independent given $Y$.
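In case the last step is not obvious, one way the derivation can go: divide the factorization by $P\{Y=y\}$ and use $P\{X=x\}\cdot P\{Y=y|X=x\}=P\{X=x,Y=y\}$, which gives
$$P\{X=x, Z=z \mid Y=y\}=\frac{P\{X=x,Y=y\}\cdot P\{Z=z|Y=y\}}{P\{Y=y\}}=P\{X=x|Y=y\}\cdot P\{Z=z|Y=y\},$$
i.e. exactly the conditional independence of $X$ and $Z$ given $Y$, from which $H(Z|X,Y)=H(Z|Y)$ follows.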