Given real symmetric matrix $H$, I am trying to numerically find all critical points of the function
$$\begin{aligned} f : \mathbb{R}^{3n} &\to \mathbb{R}\\ x &\mapsto x^T H x \end{aligned}$$
where $x$ consists of a stack of $n$ unit vectors from $\mathbb{R}^3$; that is, writing $x = (x_1, \dots, x_{3n})$, for each $j = 1, \dots, n$ we require $x_{3j-2}^2 + x_{3j-1}^2 + x_{3j}^2 = 1$.
In other words, I'm trying to optimise $x^T H x$ over the product manifold $x \in \prod_{i=1}^n S^2 \subset \mathbb{R}^{3n}$.
General optimization routines for solving this kind of problem are readily available, but I wonder if there's a better methodology considering that both the objective and the constraint are quadratic forms whose derivatives are linear functions of $x$.
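For reference, one such general-purpose routine is easy to hand-roll: projected gradient descent, which takes a Euclidean step along $-2Hx$ and then renormalises each 3-vector block. This is only a sketch under assumed conventions (blocks stored contiguously, a fixed step size); the random `H`, block count `n`, and all parameter values are illustrative, not from the problem above.

```python
import numpy as np

def project(x, n):
    """Renormalise each consecutive 3-vector block of x to unit length."""
    xb = x.reshape(n, 3)
    return (xb / np.linalg.norm(xb, axis=1, keepdims=True)).ravel()

def projected_gradient_descent(H, n, steps=5000, lr=1e-2, seed=0):
    """Minimise x^T H x over a product of n unit spheres: take a
    Euclidean gradient step (gradient 2 H x), then project back."""
    rng = np.random.default_rng(seed)
    x = project(rng.standard_normal(3 * n), n)
    for _ in range(steps):
        x = project(x - 2 * lr * (H @ x), n)
    return x

# illustrative random symmetric instance with n = 4 spheres
n = 4
rng = np.random.default_rng(1)
A = rng.standard_normal((3 * n, 3 * n))
H = (A + A.T) / 2
x = projected_gradient_descent(H, n)
print(float(x @ H @ x))   # objective value at the computed critical point
```

At any limit point each block of $Hx$ is parallel to the corresponding block of $x$, which is exactly the Lagrange condition derived below; different random seeds land on different critical points.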
I'm attempting to do this via Lagrange multipliers; dropping an overall factor of $2$, the first-order condition is
$$Hx = \begin{pmatrix}\lambda_1 I\\ & \lambda_2 I \\ && \ddots\\ &&& \lambda_n I \end{pmatrix}x \equiv \Lambda x $$
where $I$ is the $3 \times 3$ identity matrix. I'm stuck here: how would I go about numerically characterising the set of multiplier values $(\lambda_1, \dots, \lambda_n)$ for which the kernel of $H - \Lambda$ is nontrivial, while ensuring that the corresponding null vector(s) can satisfy the normalisation conditions?
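One observation that helps regardless of method: the multipliers are determined by $x$ itself. Taking the inner product of the stationarity condition with the $j$-th block $x_j$ gives $\lambda_j = x_j^\top (Hx)_j$, so criticality of any candidate $x$ can be tested by checking that the blockwise residual $(Hx)_j - \lambda_j x_j$ vanishes. A minimal sketch; the block-diagonal test matrix is a made-up example whose critical points are known (any stack of eigenvectors of the $3\times 3$ blocks):

```python
import numpy as np

def stationarity_residual(H, x, n):
    """Blockwise Lagrange multipliers and residual of H x = Lambda x.

    At a critical point lambda_j is the block Rayleigh quotient
    x_j . (H x)_j and the residual norm is zero."""
    xb = x.reshape(n, 3)
    gb = (H @ x).reshape(n, 3)
    lam = np.sum(xb * gb, axis=1)           # lambda_j = x_j^T (Hx)_j
    resid = gb - lam[:, None] * xb          # (H - Lambda) x, block by block
    return lam, np.linalg.norm(resid)

# check on a block-diagonal H: the first eigenvector of each block
# stacked together is a critical point with lambda_j = 1
n = 2
H = np.kron(np.eye(n), np.diag([1.0, 2.0, 3.0]))
x = np.tile([1.0, 0.0, 0.0], n)
lam, r = stationarity_residual(H, x, n)
print(lam)   # [1. 1.]
print(r)     # 0.0
```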
I've tried using a numerical root finder on $\det(H - \Lambda)$, but the solution set is an $(n-1)$-dimensional subset of $\Bbb R^n$, and almost none of its points yield null vectors satisfying the constraints.
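An alternative to root-finding on the determinant is to treat the stationarity condition itself as a fixed-point problem: iterate $x_j \leftarrow (H'x)_j / \|(H'x)_j\|$ with $H' = H + \sigma I$ shifted to be positive definite, so that each sweep cannot decrease $x^T H x$ (by convexity of $x \mapsto x^T H' x$). Fixed points satisfy $(Hx)_j = \lambda_j x_j$ with $\lambda_j = \|(H'x)_j\| - \sigma$, i.e. exactly $Hx = \Lambda x$. This is a sketch of a generalized power method, not a method from the question; the shift rule and all parameters are illustrative.

```python
import numpy as np

def block_power_method(H, n, steps=3000, seed=0):
    """Blockwise power iteration for maximising x^T H x over a product
    of n unit spheres in R^3.  H is shifted positive definite so the
    objective is nondecreasing along the iteration."""
    shift = max(0.0, -np.linalg.eigvalsh(H)[0]) + 1.0   # eigvalsh: ascending
    Hs = H + shift * np.eye(3 * n)
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(3 * n)
    for _ in range(steps):
        xb = (Hs @ x).reshape(n, 3)
        # blocks are generically nonzero since Hs is positive definite
        x = (xb / np.linalg.norm(xb, axis=1, keepdims=True)).ravel()
    return x

# illustrative random instance
n = 3
rng = np.random.default_rng(2)
A = rng.standard_normal((3 * n, 3 * n))
H = (A + A.T) / 2
x = block_power_method(H, n)
xb, gb = x.reshape(n, 3), (H @ x).reshape(n, 3)
lam = np.sum(xb * gb, axis=1)                    # lambda_j = x_j . (Hx)_j
print(np.linalg.norm(gb - lam[:, None] * xb))    # stationarity residual
```

This targets maximisers; running it on $-H$ gives minimisers, and restarting from many random seeds and deduplicating the limits is a common way to collect a set of candidate critical points, though it carries no guarantee of finding all of them.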