Let $ M $ be a positive semidefinite matrix.
I want to solve $$ \arg \min_{x} {x}^{T} M x \quad \mathrm{s.t.} \quad \left\| x \right\| = 1, \ x \succeq 0 $$ where $ x \succeq 0 $ means each coordinate of $x$ is nonnegative.
Is there a standard approach for attacking this problem?
Without the inequality constraint, the usual approach is to expand $ x $ in the eigenvector basis of $ M $ and observe that the minimizer must be the eigenvector corresponding to the smallest eigenvalue.
Without the equality constraint, obviously the solution is $ x = \boldsymbol{0} $, since $ M \succeq 0 $ implies $ {x}^{T} M x \geq 0 $.
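(For concreteness, a quick NumPy check of the sphere-only baseline; the $ 2 \times 2 $ matrix below is just an illustrative choice of mine.)

```python
import numpy as np

M = np.array([[2.0, -1.0],
              [-1.0, 2.0]])        # illustrative PSD matrix
w, V = np.linalg.eigh(M)           # eigenvalues in ascending order
x_sphere = V[:, 0]                 # minimizer of x^T M x over ||x||_2 = 1
print(w[0], x_sphere)              # smallest eigenvalue and its eigenvector
```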
I would use Projected Gradient Descent for this case.
Though the problem isn't convex, it works nicely in practice.
The algorithm is as follows:
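A minimal NumPy sketch of the iteration (the step size, initialization, and iteration count are my own choices; the projection clips negative coordinates and renormalizes, which is the exact Euclidean projection onto the feasible set whenever the clipped vector is nonzero):

```python
import numpy as np

def projected_gradient_descent(M, num_iter=1000, x0=None, step=None):
    """Minimize x^T M x over { x : ||x||_2 = 1, x >= 0 } (a sketch)."""
    n = M.shape[0]
    rng = np.random.default_rng(0)
    x = np.abs(rng.standard_normal(n)) if x0 is None else np.asarray(x0, float).copy()
    x /= np.linalg.norm(x)
    if step is None:
        # 1/L step size, with L the Lipschitz constant of the gradient 2 M x.
        step = 1.0 / (2.0 * np.linalg.eigvalsh(M)[-1] + 1e-12)
    for _ in range(num_iter):
        x = x - step * (2.0 * M @ x)    # gradient step on x^T M x
        x = np.maximum(x, 0.0)          # clip onto the nonnegative orthant
        nrm = np.linalg.norm(x)
        if nrm == 0.0:                  # degenerate clip; random restart
            x = np.abs(rng.standard_normal(n))
            nrm = np.linalg.norm(x)
        x /= nrm                        # renormalize onto the unit sphere
    return x
```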
In a simple 2D example I created, it worked pretty well.
The code is available at my StackExchange Mathematics Q2699867 GitHub Repository.
Remark 001
I'd even consider starting with the solution of the convex problem obtained by replacing the equality constraint $ \left\| x \right\|_{2} = 1 $ with $ \left\| x \right\|_{1} = 1 $ (together with $ x \succeq 0 $, this constrains $ x $ to the unit simplex). You can either use it as a starting point for the algorithm above or as an approximate solution in its own right.
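A sketch of that warm start, using the standard sort-based Euclidean projection onto the unit simplex (the function names and iteration counts are mine):

```python
import numpy as np

def project_simplex(y):
    """Euclidean projection of y onto { x : x >= 0, sum(x) = 1 }
    via the standard sort-based algorithm."""
    u = np.sort(y)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, len(y) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(y + theta, 0.0)

def simplex_warm_start(M, num_iter=500):
    """Solve the convex surrogate min x^T M x over the unit simplex by
    projected gradient, then renormalize onto the unit sphere to get a
    starting point (or rough approximate solution) for the original problem."""
    n = M.shape[0]
    x = np.full(n, 1.0 / n)
    step = 1.0 / (2.0 * np.linalg.eigvalsh(M)[-1] + 1e-12)
    for _ in range(num_iter):
        x = project_simplex(x - step * (2.0 * M @ x))
    return x / np.linalg.norm(x)
```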
Remark 002
Another approach might be something like what I did in Solution for $ \arg \min_{ {x}^{T} x = 1} { x}^{T} A x - {c}^{T} x $.
Yet after each iteration of updating $ \lambda $ you should also project the iterate $ x $ onto $ \mathbb{R}_{+}^{n} $ (the nonnegative orthant).
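A rough sketch of that idea, assuming a nonzero linear term $ c $ as in the linked problem (the function name, the bisection bracketing, and the tolerances are my own choices; this is a heuristic, not an exact solver):

```python
import numpy as np

def lambda_bisection(A, c, num_iter=60):
    """Stationarity of the linked problem gives x(lam) = 0.5 (A + lam I)^{-1} c,
    with lam tuned until ||x|| = 1.  Here each iterate is additionally
    projected onto the nonnegative orthant, as suggested above.
    Assumes c has at least one positive component."""
    n = A.shape[0]
    I = np.eye(n)

    def x_of(lam):
        x = 0.5 * np.linalg.solve(A + lam * I, c)
        return np.maximum(x, 0.0)       # projection onto R_+^n

    # Bracket: keep A + lam*I positive definite; ||x(lam)|| -> 0 as lam grows.
    lam_lo = -np.linalg.eigvalsh(A)[0] + 1e-8
    lam_hi = lam_lo + 1.0
    while np.linalg.norm(x_of(lam_hi)) > 1.0:
        lam_hi = lam_lo + 2.0 * (lam_hi - lam_lo)

    for _ in range(num_iter):           # bisection on the norm constraint
        lam = 0.5 * (lam_lo + lam_hi)
        if np.linalg.norm(x_of(lam)) > 1.0:
            lam_lo = lam                # norm too large -> increase lambda
        else:
            lam_hi = lam
    x = x_of(0.5 * (lam_lo + lam_hi))
    return x / np.linalg.norm(x)        # land exactly on the unit sphere
```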