I am familiar with methods for solving quadratic programming problems of the form
$$\max_{x\in C} \mu^T x- 0.5x^T \Sigma x,$$ where $$C = \{x\in \mathbb{R}^d: 1^Tx = 1, x\succeq 0\}.$$
For example, the unconstrained maximizer is $x^* = \Sigma^{-1}\mu$ (assuming $\Sigma \succ 0$), and the constrained solution can be obtained with any QP solver package.
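To make this concrete, here is a minimal numpy sketch of the unconstrained case (the data $\mu$, $\Sigma$ are made-up values, not from the question); the constrained problem over the simplex would be handed to a QP solver instead:

```python
import numpy as np

# Made-up problem data for illustration.
mu = np.array([0.5, 0.3])
Sigma = np.array([[2.0, 0.3], [0.3, 2.0]])  # symmetric positive definite

# Unconstrained maximizer x* = Sigma^{-1} mu; prefer solve() to an explicit inverse.
x_star = np.linalg.solve(Sigma, mu)

# Stationarity check: the gradient mu - Sigma x vanishes at x*.
grad = mu - Sigma @ x_star
```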
Question:
What is the standard approach, if any, for solving optimization problems with objective function $$g(x) = \mu^T x -0.5x^T \Sigma x+\lambda^T q(x),$$ where $q:\mathbb{R}^d\to \mathbb{R}^l$ is twice differentiable and concave, but not necessarily strictly concave, and $\lambda_i>0$ for all $i$?
Some notes:
If a solution exists to the unconstrained problem, then the first-order condition $\nabla g(x) = \mu - \Sigma x + Dq(x)^T\lambda = 0$ shows it is a fixed point of the function $$h(x) = \Sigma^{-1}[\mu +Dq(x)^T \lambda],$$ where $Dq(x)$ is the Jacobian matrix of $q$.
If a solution to the unconstrained problem exists, we can either run gradient ascent or the fixed-point iteration $x_{k+1} = h(x_k)$ suggested by the characterization above, and then apply a projection onto the constraint set $C$ if we want the constrained solution.
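The notes above can be sketched in numpy: a fixed-point iteration $x \leftarrow \Sigma^{-1}[\mu + Dq(x)^T\lambda]$, followed by a Euclidean projection onto the simplex $C$ using the standard sort-based algorithm. The choice $q_i(x) = -x_i^2$ is a hypothetical example, picked only so the fixed point can be checked against a closed form ($(\Sigma + 2\,\mathrm{diag}(\lambda))x = \mu$ in that case); convergence of the iteration assumes $h$ is a contraction, which is not guaranteed for arbitrary concave $q$:

```python
import numpy as np

def fixed_point_solve(mu, Sigma, lam, Dq, x0, tol=1e-10, max_iter=1000):
    """Iterate x <- Sigma^{-1} [mu + Dq(x)^T lam] until the step is below tol.

    Assumes the iteration map is a contraction; otherwise it may diverge.
    """
    Sigma_inv = np.linalg.inv(Sigma)
    x = x0
    for _ in range(max_iter):
        x_new = Sigma_inv @ (mu + Dq(x).T @ lam)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def project_simplex(v):
    """Euclidean projection onto C = {x : 1^T x = 1, x >= 0} (sort-based)."""
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    # Largest index rho with u_rho * (rho+1) > css_rho - 1 (0-based).
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

# Made-up problem data; q_i(x) = -x_i^2 is concave, so Dq(x) = -2 diag(x).
mu = np.array([0.5, 0.3])
Sigma = np.array([[2.0, 0.3], [0.3, 2.0]])
lam = np.array([0.1, 0.1])
Dq = lambda x: -2.0 * np.diag(x)

x_star = fixed_point_solve(mu, Sigma, lam, Dq, x0=np.zeros(2))
# For this particular q the fixed point solves a linear system exactly:
x_exact = np.linalg.solve(Sigma + 2.0 * np.diag(lam), mu)
x_proj = project_simplex(x_star)  # feasible point in C
```

Note that, as the notes say, the projection step is applied after solving the unconstrained problem, so `x_proj` is feasible by construction.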