Is it rigorous? Solving an optimization problem by letting a parameter that should be exactly $0$ tend to $0$.


Assume we have a simple mean-variance model $\min_{x\in\mathcal{X}} \lambda x^{\top}\Sigma x - \mu^{\top}x$, where $\mathcal{X} = \{x : 1^{\top}x = 1,\ x \geq 0\}$ and $\Sigma$ is positive definite. If one ignores risk entirely, one may set $\lambda = 0$. Here is my problem: instead of setting $\lambda = 0$ directly (in which case the resulting model $\min_{x\in\mathcal{X}} -\mu^{\top}x$ is easy to solve), can I take $\lambda = \epsilon$ for a small $\epsilon > 0$ and then obtain the solution by letting $\epsilon$ tend to $0$? Is this a rigorous, or at least reasonable, way of solving a general optimization problem with a parameter equal to $0$?
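To make the question concrete, here is a small numerical experiment (a sketch with made-up data, using `scipy.optimize.minimize`; the asset returns $\mu$ and covariance $\Sigma$ below are my own illustrative choices, not from any real model). It solves $\min_{x\in\mathcal{X}} \epsilon\, x^{\top}\Sigma x - \mu^{\top}x$ for shrinking $\epsilon$ and watches where the minimizers go:

```python
# Sketch: solve  min_x  eps * x' Sigma x - mu' x  over the simplex
# for shrinking eps and watch where the minimizers go.
# Hypothetical data: assets 1 and 2 tie on expected return,
# but asset 2 has the smaller variance.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.10, 0.10, 0.05])      # expected returns (made up)
Sigma = np.diag([0.04, 0.01, 0.02])    # covariance matrix (made up)

def solve(eps):
    """Minimize eps * x'Sigma x - mu'x subject to 1'x = 1, x >= 0."""
    obj = lambda x: eps * (x @ Sigma @ x) - mu @ x
    cons = ({'type': 'eq', 'fun': lambda x: x.sum() - 1.0},)
    res = minimize(obj, np.full(3, 1.0 / 3.0), method='SLSQP',
                   bounds=[(0.0, 1.0)] * 3, constraints=cons,
                   options={'ftol': 1e-12})
    return res.x

for eps in [1.0, 0.1, 0.01]:
    print(eps, np.round(solve(eps), 4))
```

With this data the $\epsilon$-solutions pick out, among all maximizers of $\mu^{\top}x$, the one of minimal variance (here $x \approx (0.2, 0.8, 0)$, which one can verify from the KKT conditions: on the face $x_3 = 0$, stationarity forces $x_2 = 4x_1$). By contrast, setting $\lambda = 0$ directly leaves the tie between the first two assets unresolved, so the limit of the $\epsilon$-solutions is *a* solution of the $\lambda = 0$ problem, but a specially selected one.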