I am using the Alternating Direction Method of Multipliers (ADMM) to solve the total variation (TV) regularization problem in order to denoise images. The TV problem I am considering is the following:
$$\underset{x}{\arg\min}\ \underbrace{\frac{1}{2}\|Ax-b\|_2^2}_{f(x)}+\underbrace{\lambda\sum_{i=1}^{N}\|D_i x\|_2}_{g(x)},$$
where $D_ix\in\mathbb{R}^2$ denotes the first-order finite differences of pixel $i$ in the horizontal and vertical directions, and $N$ denotes the total number of pixels in the image.
Defining $z_i=D_ix$ yields the equivalent problem
$$\underset{x,z}{\arg\min}\ \frac{1}{2}\|Ax-b\|_2^2+\lambda\sum_{i=1}^{N}\|z_i\|_2,\ \text{subject to }z_i=D_ix,\ i=1,...,N.$$
Solving the TV regularization problem using ADMM is done by considering the augmented Lagrangian
$$\mathcal{L}_{\rho}(x,z,y)=f(x)+g(z)+\sum_{i=1}^Ny_i^T(z_i-D_ix)+\frac{\rho}{2}\sum_{i=1}^N\|z_i-D_ix\|_2^2,$$
where $y_i\in\mathbb{R}^2$ denotes a Lagrange multiplier (also known as the dual variable). To solve the problem, we sequentially minimize $\mathcal{L}_{\rho}$ with respect to $x$, then with respect to $z$, and finally we perform a dual variable update.
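For concreteness, here is a minimal sketch of these three update steps for the denoising case $A=I$, written in scaled-dual form ($u_i=y_i/\rho$). It assumes periodic boundary conditions so that the $x$-update, the linear system $(I+\rho D^TD)x = b+\rho D^T(z+u)$, can be solved with the FFT; the $z$-update is the usual group soft-thresholding, and all names (`D`, `Dt`, `tv_admm`) are my own:

```python
import numpy as np

def D(x):
    """Forward finite differences with periodic boundary; returns shape (2, m, n)."""
    return np.stack([np.roll(x, -1, axis=0) - x,    # vertical differences
                     np.roll(x, -1, axis=1) - x])   # horizontal differences

def Dt(p):
    """Adjoint D^T of the finite-difference operator."""
    return (np.roll(p[0], 1, axis=0) - p[0]) + (np.roll(p[1], 1, axis=1) - p[1])

def tv_admm(b, lam, rho=1.0, n_iter=100):
    """TV denoising (A = I) via ADMM in scaled-dual form."""
    m, n = b.shape
    # Eigenvalues of D^T D under the 2-D DFT (periodic boundary):
    # 4 sin^2(pi k/m) + 4 sin^2(pi l/n)
    fy = np.fft.fftfreq(m)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    eig = 4 * np.sin(np.pi * fy) ** 2 + 4 * np.sin(np.pi * fx) ** 2
    x = b.copy()
    z = D(x)
    u = np.zeros_like(z)  # scaled dual variable, u_i = y_i / rho
    for _ in range(n_iter):
        # x-update: solve (I + rho D^T D) x = b + rho D^T (z + u) via FFT
        rhs = b + rho * Dt(z + u)
        x = np.real(np.fft.ifft2(np.fft.fft2(rhs) / (1 + rho * eig)))
        # z-update: group soft-thresholding of v_i = D_i x - u_i
        v = D(x) - u
        norm = np.maximum(np.linalg.norm(v, axis=0), 1e-12)
        z = np.maximum(1 - lam / (rho * norm), 0) * v
        # dual update on the constraint residual z_i - D_i x
        u += z - D(x)
    return x
```

For a general forward operator $A$, the $x$-update instead requires solving $(A^TA+\rho D^TD)x=A^Tb+\rho D^T(z+u)$, e.g. with conjugate gradients.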
In order to obtain convergence with ADMM, I have shown that $f$ and $g$ are closed, proper and convex, which is one of the assumptions of ADMM. However, I also need to show that the unaugmented Lagrangian
$$\mathcal{L}_0(x,z,y)=f(x)+g(z)+\sum_{i=1}^Ny_i^T(z_i-D_ix)$$
has a saddle point. I find this quite challenging, as standard methods such as examining the Hessian are not straightforward here, since the TV functional is not differentiable at $z_i=0$.
Can anyone give me a hint in terms of how to go about showing this?
Thanks in advance.