Problem: Consider the optimization problems $$\min_\beta \|y-X\beta\|^2+\alpha\|\beta\|^2 \tag 1$$ and $$\min_\beta \|\beta\|^2 \text{ subject to } \|y-X\beta\|^2 \le c \tag 2$$ where $\|x\|$ is the $2$-norm. Fix $\alpha$, and suppose $\beta^*$ is the solution to ($1$), and let $c=\|y-X\beta^*\|^2$. Is it true that the solution to ($2$) is also $\beta^*$?
Attempt: I believe this is true. The argument should be very similar to the one in "Why are additional constraint and penalty term equivalent in ridge regression?". However, when I ran some numerical experiments, the two problems appeared to have different solutions. Hence my question: do the two problems really yield the same solution? Are there exceptions I should be careful of?
It is true for $\alpha>0$. Since $\beta^*$ solves (1), for every $\beta$ we have $$\|y-X\beta^*\|^2 + \alpha\|\beta^*\|^2 \le \|y-X\beta\|^2 + \alpha\|\beta\|^2.$$ Rearranging and dividing by $\alpha>0$: $$\|\beta^*\|^2 \le \frac{1}{\alpha}(\|y-X\beta\|^2 - \|y-X\beta^*\|^2) + \|\beta\|^2.$$ Now let $\beta$ be any feasible point of (2), i.e. $\|y-X\beta\|^2 \le c = \|y-X\beta^*\|^2$; then the first term on the right-hand side is nonpositive, so $$\|\beta^*\|^2 \le \|\beta\|^2.$$ Since $\beta^*$ is itself feasible for (2) (it attains the constraint with equality), it is a minimizer of (2).
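This is easy to check numerically. The sketch below (my own illustration, not from the question) computes the closed-form ridge solution $\beta^* = (X^\top X + \alpha I)^{-1}X^\top y$, sets $c = \|y-X\beta^*\|^2$, and then solves (2) directly with a generic constrained solver (`scipy.optimize.minimize` with SLSQP), starting from the OLS solution, which is always feasible:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
alpha = 2.0

# Closed-form solution of (1): beta* = (X'X + alpha*I)^{-1} X'y
beta_star = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)
c = np.sum((y - X @ beta_star) ** 2)

# Solve (2): minimize ||beta||^2 subject to ||y - X beta||^2 <= c.
# The OLS solution minimizes the residual, so it is a feasible start.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(
    lambda b: b @ b,
    x0=beta_ols,
    constraints=[{"type": "ineq",
                  "fun": lambda b: c - np.sum((y - X @ b) ** 2)}],
    method="SLSQP",
)

# The two solutions should agree up to solver tolerance.
print(np.max(np.abs(res.x - beta_star)))
```

If your experiments showed a discrepancy, it is worth checking the solver's convergence tolerance: the constrained problem is solved only up to numerical precision, so the two solutions will agree approximately, not to machine precision.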
Analogously, for $\alpha<0$ one can check that the solution $\beta^*$ of (1) also solves (2), BUT with maximization instead of minimization: rearranging as before and using feasibility, $$\alpha(\|\beta^*\|^2 - \|\beta\|^2) \le \|y-X\beta\|^2 - \|y-X\beta^*\|^2 \le 0,$$ and dividing by $\alpha<0$ flips the inequality: $\|\beta^*\|^2 \ge \|\beta\|^2$. In any case, you cannot assure equivalence in general, since the objective of (1) may fail to be convex for $\alpha<0$ (it is non-convex as soon as $\alpha < -\lambda_{\min}(X^\top X)$).
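A quick illustration of the $\alpha<0$ case (again my own sketch): choosing $\alpha$ negative but with $|\alpha|$ below the smallest eigenvalue of $X^\top X$, so that the stationary point of (1) is still its minimizer, the OLS solution is feasible for (2) and, as the inequality above predicts, has a *smaller* norm than $\beta^*$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 5
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

# Negative alpha, small enough that X'X + alpha*I stays positive
# definite, so (1) is still convex and beta* is its minimizer.
lam_min = np.linalg.eigvalsh(X.T @ X)[0]
alpha = -0.5 * lam_min

beta_star = np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)
c = np.sum((y - X @ beta_star) ** 2)

# OLS has the smallest possible residual, hence is feasible for (2);
# by the inequality, any feasible point has norm at most ||beta*||.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.sum((y - X @ beta_ols) ** 2) <= c)          # feasibility
print(beta_ols @ beta_ols <= beta_star @ beta_star)  # smaller norm
```

In SVD terms the effect is clear: a negative $\alpha$ shrinks the denominators $\sigma_i^2 + \alpha$, so $\beta^*$ is an "anti-shrunk" version of OLS with a larger norm.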