Recently I have been studying a problem that boils down to the following:
$$\epsilon_1 = \underset{\epsilon}{\mathop{\arg \min}}\; \epsilon^{\top}\epsilon$$ $$\text{s.t.}\quad \mathbf{a}^{\top}\epsilon + R\|\mathbf{b} + \epsilon\|_{2} = c,$$
where $\mathbf{a}$, $\mathbf{b}$, $c$, and $R$ are given and $R$ is positive. Suppose that the problem above is feasible. I was wondering how the optimal solution changes when $R$ is perturbed by a small quantity $\delta$, which yields the following problem:
$$\epsilon_2 = \underset{\epsilon}{\mathop{\arg \min}}\; \epsilon^{\top}\epsilon$$ $$\text{s.t.}\quad \mathbf{a}^{\top}\epsilon + (R+\delta)\|\mathbf{b} + \epsilon\|_{2} = c.$$
My intuition is that $\|\epsilon_1 - \epsilon_2\| = O(|\delta|)$ when $\delta$ is small enough, but I do not have an exact analytical argument. Property 1 of this reference shows that if $c$ rather than $R$ is perturbed by $\delta$, then the claim $\|\epsilon_1 - \epsilon_2\| = O(|\delta|)$ does seem to hold.
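As a quick sanity check of this intuition, here is a small numerical experiment. The values of $\mathbf{a}$, $\mathbf{b}$, $c$, $R$ below are my own illustrative choices (picked so the constraint is feasible), and solving each problem with SLSQP is just one convenient way to get the constrained minimizer; none of this is part of the original problem data:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data, not from the problem statement; chosen so the
# constraint a^T e + R * ||b + e||_2 = c has a solution.
a = np.array([1.0, 0.0])
b = np.array([0.5, -0.3])
c = 2.0
R = 0.8

def solve(R_val):
    """Minimize e^T e subject to a^T e + R_val * ||b + e||_2 = c."""
    constraint = {"type": "eq",
                  "fun": lambda e: a @ e + R_val * np.linalg.norm(b + e) - c}
    res = minimize(lambda e: e @ e, x0=np.zeros_like(a),
                   method="SLSQP", constraints=[constraint],
                   options={"ftol": 1e-12, "maxiter": 500})
    return res.x

eps1 = solve(R)
d1 = np.linalg.norm(eps1 - solve(R + 1e-2))   # perturbation delta
d2 = np.linalg.norm(eps1 - solve(R + 5e-3))   # perturbation delta / 2
# If ||eps1 - eps2|| = O(|delta|), halving delta should roughly halve
# the distance, i.e. d1 / d2 should be close to 2.
print(d1, d2, d1 / d2)
```

On this instance the ratio `d1 / d2` comes out close to 2, which is consistent with $\|\epsilon_1-\epsilon_2\|$ scaling linearly in $|\delta|$; of course, a numerical check on one instance is not a proof.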