The general method of Projection onto Convex Sets (POCS) can be used to find a point in the intersection of a number of convex sets, i.e.,
$$ \text{find } x \in \mathbb{R}^N \text{ s.t. } x \in \bigcap_i C_i $$
This method finds some feasible point in the intersection of the convex sets. Now my question is: is there a similar method that instead finds a point $x^*$ of minimum norm, i.e. solves
$$ x^* = \arg \min_{x \in \mathbb{R}^N} \Vert x\Vert \text{ s.t. } x \in \bigcap_i C_i$$
or, more generally, finds the projection of a point $a \in \mathbb{R}^N$ onto the intersection of the convex sets, i.e. solves
$$ x^* = \arg \min_{x \in \mathbb{R}^N} \Vert x - a\Vert \text{ s.t. } x \in \bigcap_i C_i$$ for a given $a \in \mathbb{R}^N$?
Edit
The problem we are trying to solve is a 3D tomography reconstruction problem, so the variable $x$ takes gigabytes of RAM. There is already a POCS algorithm (a variant of the Algebraic Reconstruction Technique, ART) that finds a feasible point. Is there a way to use it as a black box, or to adapt it, to find a minimum-norm solution instead?
This can be done fairly easily using proximal algorithms. Let $\delta_i$ be the indicator function of $C_i$: \begin{equation} \delta_i(x) = \begin{cases} 0 & \text{if } x \in C_i, \\ \infty & \text{otherwise.} \end{cases} \end{equation} Your optimization problem can be written as \begin{equation} \text{minimize} \quad \| x - a \| + \sum_i \delta_i(x). \end{equation} The objective function is a sum of functions that have easy proximal operators. Thus, you can use the Douglas-Rachford method (together with the consensus trick) to solve this optimization problem.
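To make "easy proximal operators" concrete: for each indicator term the prox is simply the projection onto $C_i$, and for the (unsquared) norm term it is a soft-thresholding of the argument toward $a$. With a step size $t > 0$ (a parameter of the method, not of your problem),
\begin{align*} \operatorname{prox}_{t\delta_i}(v) &= P_{C_i}(v), \\ \operatorname{prox}_{t\|\cdot - a\|}(v) &= a + \left(1 - \frac{t}{\|v - a\|}\right)_{+} (v - a), \end{align*}
where $(\cdot)_+$ denotes the positive part, so the second prox returns $a$ whenever $\|v - a\| \le t$.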
The Douglas-Rachford method is an iterative method for minimizing the sum of two convex functions, each of which has an easy proximal operator. Since we have a sum of more than two functions, it may seem that Douglas-Rachford does not apply. However, we can get around this by using the consensus trick. We reformulate our problem as
\begin{align*} \text{minimize} & \quad \underbrace{\|x_0 - a \| + \sum_i \delta_i(x_i)}_{f(x_0,x_1,\ldots,x_m)} + \underbrace{\delta_S(x_0,x_1,\ldots,x_m)}_{g(x_0,x_1,\ldots,x_m)} \end{align*} where \begin{equation} S = \{(x_0,x_1,\ldots,x_m) \mid x_0 = x_1 = \cdots = x_m \} \end{equation} and $\delta_S$ is the indicator function of $S$. The variables in this reformulated problem are the vectors $x_0,x_1,\ldots,x_m$. The indicator function $\delta_S$ is being used to enforce the constraint that all these vectors should be equal. We are now minimizing a sum of two convex functions, $f$ and $g$, which have easy prox-operators because $f$ is a separable sum of easy functions and $g$ is the indicator function of a set that we can project onto easily. So we are now able to apply the Douglas-Rachford method.
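Explicitly (again with a step size $t > 0$), $\operatorname{prox}_{tf}$ acts on each block separately, using the formulas above,
\begin{equation} \operatorname{prox}_{tf}(x_0, x_1, \ldots, x_m) = \bigl( \operatorname{prox}_{t\|\cdot - a\|}(x_0),\; P_{C_1}(x_1),\; \ldots,\; P_{C_m}(x_m) \bigr), \end{equation}
while $\operatorname{prox}_{tg}$ is the projection onto $S$, which simply replaces every block by the average:
\begin{equation} \operatorname{prox}_{tg}(x_0, x_1, \ldots, x_m) = (\bar{x}, \bar{x}, \ldots, \bar{x}), \qquad \bar{x} = \frac{1}{m+1} \sum_{j=0}^{m} x_j. \end{equation}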
The Douglas-Rachford iteration is just three lines, and the code for this problem could probably be written in about a page of Matlab (unless projecting onto the sets $C_i$ is very complicated).
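For illustration, here is a minimal sketch of the resulting iteration in Matlab/Octave. The two example sets (a box and a halfspace), their projections, the step size $t$, and the iteration count are placeholder choices, not part of the original problem; in the tomography setting you would replace the entries of `proj` with your own projectors (e.g. the ART step).

```
% Douglas-Rachford with the consensus trick:
% minimize ||x0 - a|| + sum_i delta_i(xi)  s.t.  x0 = x1 = ... = xm.
% The sets C1, C2 below are toy examples only.
n = 5;  a = randn(n,1);                        % point to project
c = randn(n,1);  b = 0.5;                      % data for the halfspace
t = 1.0;                                       % step size (tuning parameter)

proj = { @(v) min(max(v,-1), 1), ...              % P_{C1}: box {x : |x_j| <= 1}
         @(v) v - (max(c'*v - b, 0)/(c'*c))*c };  % P_{C2}: halfspace {x : c'x <= b}
m = numel(proj);

% prox of t*||x0 - a||: soft-thresholding toward a (returns a if ||v - a|| <= t)
prox0 = @(v) a + max(1 - t/max(norm(v - a), t), 0)*(v - a);

Z = repmat(a, 1, m+1);                         % columns hold z_0, z_1, ..., z_m
for k = 1:500
    X = Z;
    X(:,1) = prox0(Z(:,1));                    % prox_{tf}, block x_0
    for i = 1:m
        X(:,i+1) = proj{i}(Z(:,i+1));          % prox_{tf}, blocks x_1, ..., x_m
    end
    xbar = mean(2*X - Z, 2);                   % prox_{tg}: average the blocks of 2X - Z
    Y = repmat(xbar, 1, m+1);
    Z = Z + Y - X;                             % Douglas-Rachford update
end
xstar = mean(X, 2);                            % approximate solution
```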