Let $X,Y,Z$ be random variables. Assume we know the conditional entropies $H(Y|X)$ and $H(Z|Y)$ but we want to bound $H(Z|X)$ which is unknown
Is there any relation between these three quantities, some "triangle-like" inequality or similar?
Something like $H(Z|X) = H(Z|Y) + H(Y|X) + \dots$
Intuitively, I see that if $H(Z|Y)=H(Y|X)=0$, there are deterministic functions $f$ and $g$ such that $Z=f(Y)$ and $Y=g(X)$, thus we have $Z=f \circ g(X)$, hence $H(Z|X) = 0$. But apart from this trivial case, I can't find an explicit relation...
thanks !
Notice that
$$\begin{align} H(Y \mid X)+H(Z \mid Y)&= H(Y \mid X) + H(Z \mid Y,X)+ I(X;Z \mid Y) \\ &=H(Y,Z \mid X)+ I(X;Z \mid Y)\\ &=H(Z \mid X)+H(Y \mid X,Z) + I(X;Z \mid Y) \tag{1} \end{align}$$
where we've used $H(Z \mid Y)-H(Z \mid Y,X)=I(X;Z \mid Y)$ and $H(Y,Z \mid X)=H(Z \mid X)+H(Y \mid X,Z)$.
Since the last two terms in $(1)$ are nonnegative, we get the bound $H(Z \mid X) \le H(Y \mid X)+H(Z \mid Y)$ from Arash's answer.
If $X\to Y \to Z$ form a Markov chain, then the last term in $(1)$ vanishes. If, in addition, $Y=g(X,Z)$ for some deterministic function $g$, then the second term also vanishes and we have equality.
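A quick numerical sanity check (my own sketch, not part of the answer above): on a random joint pmf $p(x,y,z)$, identity $(1)$ and the bound $H(Z \mid X) \le H(Y \mid X)+H(Z \mid Y)$ can be verified by computing all quantities from marginal entropies.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((3, 4, 5))
p /= p.sum()  # random joint pmf p(x, y, z); axes are (X, Y, Z)

def ent(q):
    """Shannon entropy in bits of a pmf of any shape, with 0*log(0) := 0."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# Marginals needed for the conditional entropies.
pX  = p.sum(axis=(1, 2))
pY  = p.sum(axis=(0, 2))
pXY = p.sum(axis=2)
pYZ = p.sum(axis=0)
pXZ = p.sum(axis=1)

# Conditional entropies via H(A|B) = H(A,B) - H(B).
H_Y_given_X  = ent(pXY) - ent(pX)   # H(Y|X)
H_Z_given_Y  = ent(pYZ) - ent(pY)   # H(Z|Y)
H_Z_given_X  = ent(pXZ) - ent(pX)   # H(Z|X)
H_Y_given_XZ = ent(p)   - ent(pXZ)  # H(Y|X,Z)

# I(X;Z|Y) = H(X|Y) + H(Z|Y) - H(X,Z|Y).
I_XZ_given_Y = ent(pXY) + ent(pYZ) - ent(pY) - ent(p)

lhs = H_Y_given_X + H_Z_given_Y
rhs = H_Z_given_X + H_Y_given_XZ + I_XZ_given_Y
assert np.isclose(lhs, rhs)          # identity (1)
assert H_Z_given_X <= lhs + 1e-12    # the bound H(Z|X) <= H(Y|X) + H(Z|Y)
print("identity (1) and the bound both hold")
```

The same script with a pmf of the form $p(x)p(y\mid x)p(z\mid y)$ (a Markov chain) would show the $I(X;Z\mid Y)$ term dropping to zero.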