Continuous Dependence on Regularization Constant


In machine learning, many models lead to an optimization problem of the form $L(x) + \lambda R(x)$, where $L(x)$ is a loss function and $R(x)$ is a regularization function. Suppose both $L$ and $R$ are convex and sufficiently regular. Is it true that $x^\star(\lambda) = \operatorname*{argmin}_{x \in \mathbb{R}} \{L(x) + \lambda R(x)\}$ depends continuously on $\lambda$ (assuming the minimizer exists and is unique, e.g. when $L + \lambda R$ is strictly convex)? If so, how can one prove it?
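Not an answer, but a numerical sketch of the claim for one concrete choice (my own example, not from the question): take $L(x) = (x-1)^2$ and $R(x) = x^2$, for which the first-order condition $L'(x) + \lambda R'(x) = 0$ gives the closed form $x^\star(\lambda) = 1/(1+\lambda)$, visibly continuous in $\lambda$. The helper `argmin_reg` below is a hypothetical name; it finds the minimizer by bisection on the nondecreasing derivative of the convex objective.

```python
def argmin_reg(L_prime, R_prime, lam, lo=-100.0, hi=100.0, tol=1e-10):
    """Minimizer of L + lam*R for convex differentiable L, R on [lo, hi].

    For a convex objective, the derivative g(x) = L'(x) + lam*R'(x) is
    nondecreasing, so bisection on its sign change locates the minimizer.
    """
    g = lambda x: L_prime(x) + lam * R_prime(x)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Example: L(x) = (x-1)^2, R(x) = x^2, so x*(lam) = 1/(1+lam).
L_prime = lambda x: 2.0 * (x - 1.0)
R_prime = lambda x: 2.0 * x

lams = [0.1 * k for k in range(51)]
xs = [argmin_reg(L_prime, R_prime, lam) for lam in lams]

# Agrees with the closed form, and small steps in lambda give small
# steps in x*(lambda) -- consistent with continuous dependence.
assert all(abs(x - 1.0 / (1.0 + lam)) < 1e-8 for x, lam in zip(xs, lams))
print(max(abs(a - b) for a, b in zip(xs[1:], xs[:-1])))
```

Of course, a numerical check on one example proves nothing in general; the standard route to a proof is via Berge's maximum theorem or an $\varepsilon$-argmin/strong-convexity argument.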