Let $f, g$ be proper closed convex functions, let $u=(y,z)$, and consider the problem: $$ \begin{equation} \begin{split} \min_{x,u} \ &f(x) + g(u) \\ \text{s.t. } & Ax + Bu = c \end{split} \end{equation} $$
We may use the ADMM framework, so that iteration $k$ consists of: $$ \begin{equation} \begin{split} x_{k+1} &= \arg\min_{x} L_p(x,u_k,\lambda_k) \\ u_{k+1} &= \arg\min_{u} L_p(x_{k+1},u,\lambda_k) \\ \lambda_{k+1} &= \lambda_k + p ( Ax_{k+1} + Bu_{k+1} - c ) \end{split} \end{equation} $$ where $ L_p(x,u,\lambda) = f(x) + g(u) + \langle \lambda, Ax + Bu - c \rangle + \frac{p}{2} \| Ax + Bu - c \|_2^2 $ is the Augmented Lagrangian associated with the problem.
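For concreteness, here is a minimal numerical sketch of this iteration on a toy scalar instance. All problem data below ($f(x)=\frac{1}{2}(x-a)^2$, $g(u)=\frac{1}{2}(u-b)^2$, the constraint $x-u=0$, and the values of $a,b,p$) are my own illustrative choices, not part of the question; both subproblems then have closed-form minimizers obtained by setting the gradient of $L_p$ to zero:

```python
# Toy scalar instance (illustrative): f(x) = (x-a)^2/2, g(u) = (u-b)^2/2,
# constraint x - u = 0, i.e. A = 1, B = -1, c = 0.
a, b, p = 1.0, 3.0, 1.0
x = u = lam = 0.0
for _ in range(100):
    x = (a - lam + p * u) / (1 + p)   # x-update: argmin_x L_p(x, u_k, lam_k)
    u = (b + lam + p * x) / (1 + p)   # u-update: argmin_u L_p(x_{k+1}, u, lam_k)
    lam = lam + p * (x - u)           # dual update
print(x, u, lam)  # x and u approach (a+b)/2 = 2, lam approaches -1
```

On this strongly convex instance the iterates contract toward the unique primal-dual solution, as the standard 2-block convergence theory guarantees.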
However, assume that the subproblem for $u$ is difficult to solve. Let us split the $u$-update with a single pass of Block-Coordinate Descent (BCD) in $y,z$ to get easy-to-solve subproblems: $$ \begin{equation} \begin{split} x_{k+1} &= \arg\min_{x} L_p(x,y_k,z_k,\lambda_k) \\ y_{k+1} &= \arg\min_{y} L_p(x_{k+1},y,z_k,\lambda_k) \\ z_{k+1} &= \arg\min_{z} L_p(x_{k+1},y_{k+1},z,\lambda_k) \\ \lambda_{k+1} &= \lambda_k + p ( Ax_{k+1} + Bu_{k+1} - c ) \end{split} \end{equation} $$ where $u_{k+1} = (y_{k+1}, z_{k+1})$.
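To make the scheme concrete, here is a minimal numerical sketch on a toy scalar instance with a non-separable $g$. All data below ($f(x)=\frac{1}{2}x^2$, $g(y,z)=\frac{1}{2}y^2+\frac{1}{2}z^2+\frac{1}{2}(y-z)^2$, the constraint $x+y+z=3$, and $p=1$) are my own illustrative choices; each update sets the corresponding partial gradient of $L_p$ to zero:

```python
# Toy instance (illustrative): f(x) = x^2/2 and the non-separable
# g(y,z) = y^2/2 + z^2/2 + (y-z)^2/2, with constraint x + y + z = c, c = 3.
c, p = 3.0, 1.0
x = y = z = lam = 0.0
for _ in range(100):
    x = (p * (c - y - z) - lam) / (1 + p)      # argmin_x L_p(x, y_k, z_k, lam_k)
    y = (z - lam + p * (c - x - z)) / (2 + p)  # BCD step in y (z held at z_k)
    z = (y - lam + p * (c - x - y)) / (2 + p)  # BCD step in z (y held at y_{k+1})
    lam = lam + p * (x + y + z - c)            # dual update with u_{k+1} = (y_{k+1}, z_{k+1})
print(x, y, z)  # on this instance the iterates approach the solution x = y = z = 1
```

Numerically, the single-BCD-pass iteration does converge on this particular strongly convex instance, but of course that says nothing about the general case, which is exactly my question.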
That is, we have replaced the exact $u$-update of standard ADMM with a single inexact BCD pass, which takes us outside the standard 2-block ADMM framework.
Assume further that $g$ is not separable in the two variables, i.e. $g(y,z) \neq g_1(y) + g_2(z)$, so that an analysis in terms of a 3-block ADMM (see "The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent") is not possible.
I've seen the Generalized ADMM (Theorem 8), which allows inexact updates, but it adds a relaxation term to the standard ADMM that I would like to avoid.
I also see that repeated BCD passes would yield a more accurate minimization over $u$, but that would be too computationally expensive.