Suppose three random vectors of length $n$ form a Markov chain: $\mathbf{X} \rightarrow \mathbf{Y} \rightarrow \mathbf{Z}$. Moreover, each $Y_{i}$ is a deterministic function of $Z_{i}$ (i.e., $Y_{i}=g_{i}(Z_{i})$).
Does the inequality \begin{align} I(\mathbf{X};\mathbf{Z}) \leq I(\mathbf{X};\mathbf{Y}) \end{align} still hold? If not, what additional properties must be satisfied for the above inequality to hold?
I tried the following approach:
\begin{align} I(\mathbf{X};\mathbf{Z}) &= \sum_{i = 1}^{n} I(\mathbf{X};Z_i|Z_{1}^{i-1})\\ &= \sum_{i = 1}^{n} H(\mathbf{X}|Z_{1}^{i-1}) - H(\mathbf{X}|Z_{i},Z_{1}^{i-1})\\ &\leq \sum_{i = 1}^{n} H(\mathbf{X}|Z_{1}^{i-1}) - H(\mathbf{X}|Y_{i},Z_{i},Z_{1}^{i-1})\\ &= \sum_{i = 1}^{n} H(\mathbf{X}|Z_{1}^{i-1}) - H(\mathbf{X}|Y_{i},Z_{1}^{i-1})\\ &= \sum_{i = 1}^{n} I(\mathbf{X};Y_{i}|Z_{1}^{i-1}). \end{align}
However, I'm stuck and unsure how to proceed from here (or whether other conditions are needed for the inequality to hold). Any help is appreciated.
The definitions of entropy, mutual information, etc., for discrete random variables are not specific to one-dimensional (scalar) quantities; they apply unchanged to multi-dimensional variables. In particular, the data processing inequality holds for the vector-valued chain exactly as in the scalar case, so it's true that $$ \begin{align} \mathbf{X} \rightarrow \mathbf{Y} \rightarrow \mathbf{Z} \implies I(\mathbf{X};\mathbf{Z}) \leq I(\mathbf{X};\mathbf{Y}) \end{align} $$ with no extra conditions needed (the assumption $Y_i = g_i(Z_i)$ plays no role).
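For completeness, the standard data processing argument carries over verbatim: expand $I(\mathbf{X};\mathbf{Y},\mathbf{Z})$ with the chain rule in two ways,
$$\begin{align} I(\mathbf{X};\mathbf{Y},\mathbf{Z}) &= I(\mathbf{X};\mathbf{Y}) + I(\mathbf{X};\mathbf{Z}|\mathbf{Y}) \\ &= I(\mathbf{X};\mathbf{Z}) + I(\mathbf{X};\mathbf{Y}|\mathbf{Z}). \end{align}$$
The Markov chain gives $I(\mathbf{X};\mathbf{Z}|\mathbf{Y}) = 0$, hence $I(\mathbf{X};\mathbf{Z}) = I(\mathbf{X};\mathbf{Y}) - I(\mathbf{X};\mathbf{Y}|\mathbf{Z}) \leq I(\mathbf{X};\mathbf{Y})$, since mutual information is nonnegative.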
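As a quick numerical sanity check (a sketch, not part of the proof; the distributions below are arbitrary, made up for illustration), one can build a random Markov chain of length-2 binary vectors and verify the inequality directly from the joint distribution:

```python
import itertools
import math
import random

random.seed(0)

def normalize(d):
    s = sum(d.values())
    return {k: v / s for k, v in d.items()}

# Each "vector" variable takes values in {0,1}^2, i.e. 4 symbols.
symbols = list(itertools.product([0, 1], repeat=2))

# Random p(x), p(y|x), p(z|y) -> a Markov chain X -> Y -> Z.
px = normalize({x: random.random() for x in symbols})
py_x = {x: normalize({y: random.random() for y in symbols}) for x in symbols}
pz_y = {y: normalize({z: random.random() for z in symbols}) for y in symbols}

# Joint distribution p(x, y, z) = p(x) p(y|x) p(z|y).
joint = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
         for x in symbols for y in symbols for z in symbols}

def mutual_information(pairs):
    """I(A;B) in bits from a dict {(a, b): p(a, b)}."""
    pa, pb = {}, {}
    for (a, b), p in pairs.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in pairs.items() if p > 0)

# Marginalize the joint down to (X, Y) and (X, Z).
pxy, pxz = {}, {}
for (x, y, z), p in joint.items():
    pxy[(x, y)] = pxy.get((x, y), 0.0) + p
    pxz[(x, z)] = pxz.get((x, z), 0.0) + p

ixy = mutual_information(pxy)
ixz = mutual_information(pxz)
print(f"I(X;Y) = {ixy:.4f} bits, I(X;Z) = {ixz:.4f} bits")
assert ixz <= ixy + 1e-12  # data processing inequality
```

Re-running with different seeds (or larger vector lengths) never violates the assertion, which is just the data processing inequality applied to the whole vectors.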