Please refer to Boyd et al.'s convergence analysis of ADMM (Chapter 3 and Appendix A).
My question is: Why do we need $f$ and $g$ to be convex?
I don't see the need for this assumption. If the convexity assumption is removed, the analysis still seems to go through.
Thanks.
P.S. Related: Subgradients of non-convex functions
Their proof uses subgradients of a function $f$ and their properties. If $f$ is not convex, a subgradient need not exist at every point. Consider $f(x) = -x^2$, which is concave, not convex: the ordinary gradient $f'(x) = -2x$ is *not* a subgradient (strange as that sounds), because it fails the defining subgradient inequality $f(y) \ge f(x) + \langle -2x, y - x\rangle$ for all $y$. In fact, rearranging gives $0 \ge (x - y)^2$, so the inequality holds only at $y = x$.
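A quick numerical sanity check of the point above (a sketch; the function and test points are just illustrative choices):

```python
# For f(x) = -x^2, test whether g = f'(x) = -2x satisfies the
# subgradient inequality f(y) >= f(x) + g*(y - x).

def f(x):
    return -x**2

def violates_subgradient_inequality(x, y):
    g = -2 * x  # candidate "subgradient": the ordinary gradient f'(x)
    return f(y) < f(x) + g * (y - x)

x = 1.0
# At y = x the inequality holds (with equality)...
assert not violates_subgradient_inequality(x, x)
# ...but it fails at every other y, consistent with 0 >= (x - y)^2:
for y in [-2.0, 0.0, 0.5, 3.0]:
    assert violates_subgradient_inequality(x, y)
print("gradient of -x^2 fails the subgradient inequality for all y != x")
```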
Traditionally, ADMM is used for, and was proved to converge for, convex functions. Convexity is sufficient for the classical analysis, but it is not necessary for ADMM to work in practice: search for "nonconvex ADMM" or "nonconvex nonsmooth ADMM" and you will find a number of recent papers establishing convergence under weaker assumptions.