Consider the following Markov chain:
\begin{equation} X^n \rightarrow W \rightarrow Y^n \end{equation}
The notation $X^n$ is shorthand for $X^n := (X_1, X_2, \ldots, X_n)$. Let $W$ take values in a set of cardinality $2^{nR}$ and let $X_1, X_2, \ldots, X_n$ be i.i.d. copies of a random variable $X$. I am interested in whether the following statement is true:
\begin{equation} \max_i I(X_i;Y_i) \leq R \end{equation}
Note that proving
\begin{equation} \min_i I(X_i;Y_i) \leq R \end{equation}
is fairly straightforward. First, by the data processing inequality applied to the Markov chain $X^n \rightarrow W \rightarrow Y^n$, we have
\begin{equation} I(X^n;Y^n) \leq I(X^n;W) \leq H(W) \leq nR \end{equation}
It then suffices to lower-bound $I(X^n;Y^n)$ as follows:
\begin{align} I(X^n;Y^n) &= H(X^n) - H(X^n \mid Y^n) \\ &= nH(X) - H(X^n \mid Y^n) && \text{($X_i$ i.i.d.)} \\ &= nH(X) - \sum_{i=1}^n H(X_i \mid X^{i-1}, Y^n) && \text{(chain rule)} \\ &\geq nH(X) - \sum_{i=1}^n H(X_i \mid Y_i) && \text{(conditioning reduces entropy)} \\ &= \sum_{i=1}^n I(X_i;Y_i) && \text{(since $H(X_i) = H(X)$)} \\ &\geq n \min_i I(X_i;Y_i) \end{align}
Combining the two bounds, $nR \geq n \min_i I(X_i;Y_i)$, and the statement about the minimum follows. Can we produce a similar result for the first claim, i.e. the bound on the maximum?
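As a numerical sanity check of the proven bounds (not of the open question), here is a small script on a hypothetical instance of the chain that I chose for illustration: $n = 2$, $X$ a uniform bit, $R = 1/2$ (so $|W| = 2^{nR} = 2$), with $W = X_1$ and $Y^n = (W, W)$. The per-letter mutual informations are computed exactly from the joint pmfs.

```python
import numpy as np

def mutual_information(joint):
    """I(A;B) in bits, from a joint pmf given as a 2-D array."""
    pa = joint.sum(axis=1, keepdims=True)   # marginal of A (column vector)
    pb = joint.sum(axis=0, keepdims=True)   # marginal of B (row vector)
    mask = joint > 0                        # avoid log(0) on zero-probability cells
    return float((joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])).sum())

n, R = 2, 0.5

# Joint pmf of (X_1, Y_1): Y_1 = X_1, so the mass sits on the diagonal.
p_x1_y1 = np.array([[0.5, 0.0],
                    [0.0, 0.5]])

# Joint pmf of (X_2, Y_2): Y_2 = X_1 is independent of X_2, so the pmf is a product.
p_x2_y2 = np.array([[0.25, 0.25],
                    [0.25, 0.25]])

I = [mutual_information(p_x1_y1), mutual_information(p_x2_y2)]
print(I)                            # per-letter mutual informations: [1.0, 0.0]
print(sum(I) <= n * R + 1e-12)      # sum bound:  sum_i I(X_i;Y_i) <= nR  -> True
print(min(I) <= R + 1e-12)          # min bound:  min_i I(X_i;Y_i) <= R   -> True
```

Both inequalities from the derivation hold with equality-or-slack on this toy example; the bounds it checks are exactly the sum and min steps above.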