Does the condition $\mathbf{u}^T \mathbf{J}\mathbf{u}= N\lambda_{\mathrm{max}}$ (for real vector $\mathbf{u}$ such that $|\mathbf{u}|^2=N$) necessarily imply that $\mathbf{u}$ is the eigenvector of the real symmetric matrix $\mathbf{J}$ corresponding to its largest eigenvalue $\lambda_{\mathrm{max}}$?
We know that, if $\mathbf{u}$ is the eigenvector corresponding to the largest eigenvalue, then $\mathbf{u}^T \mathbf{J}\mathbf{u}= N\lambda_{\mathrm{max}}$ holds. But does the converse hold?
If not, what extra conditions are needed to guarantee this?
My work:
Assume $\mathbf{u}$ is written as a linear combination of the eigenvectors $\mathbf{v}_k$ of $\mathbf{J}$ $$ \mathbf{u}=\sum_{k=1}^N a_k \mathbf{v}_k\ . $$ Then $$ \mathbf{u}^T \mathbf{J}\mathbf{u}=\sum_{k,m}a_ka_m \mathbf{v}_m^T\mathbf{J}\mathbf{v}_k=\sum_{k,m}a_ka_m \lambda_k\mathbf{v}_m^T\mathbf{v}_k=N\sum_{k=1}^N a_k^2\lambda_k\ , $$ where I use orthogonality between eigenvectors of $\mathbf{J}$, and normalisation of such eigenvectors to $N$.
Imposing now $$ N\sum_{k=1}^N a_k^2\lambda_k=N\lambda_{1}\ , $$ subject to $\sum_{k}a_k^2=1$ (where I assume $\lambda_1\equiv\lambda_{\mathrm{max}}$), I see that $(a_1,\ldots,a_N)=(1,0,\ldots,0)$ is a solution of the above equation [which would correspond to $\mathbf{u}$ being precisely the 'largest' eigenvector]. However, I don't see why this solution should be unique, i.e. why there cannot be another coefficient vector $(a_1,\ldots,a_N)$ satisfying it.
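A quick numerical sanity check of the expansion above (a sketch using numpy; the matrix and coefficients are random, and the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5

# Random real symmetric matrix J.
A = rng.standard_normal((N, N))
J = (A + A.T) / 2

# Eigendecomposition; rescale eigenvectors so that v_k^T v_k = N,
# matching the normalisation used in the derivation.
lam, V = np.linalg.eigh(J)   # columns of V are orthonormal
V = V * np.sqrt(N)           # now V.T @ V = N * I

# Any coefficient vector with sum_k a_k^2 = 1 gives |u|^2 = N.
a = rng.standard_normal(N)
a /= np.linalg.norm(a)
u = V @ a

print(np.isclose(u @ u, N))
# The quadratic form equals N * sum_k a_k^2 * lambda_k, as derived.
print(np.isclose(u @ J @ u, N * np.sum(a**2 * lam)))
```

Both checks print `True` for any symmetric $\mathbf{J}$, not just this random one, since the identity only uses orthogonality of the eigenvectors.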
Suppose $\sum_k a_k^2=1$. You wish to characterise the solutions of $\sum_k a_k^2\lambda_k=\lambda_1$.

Rather than dividing by $\lambda_1$ (which silently assumes $\lambda_1>0$; if $\lambda_1\le 0$ the inequality $\lambda_k/\lambda_1\le 1$ fails), subtract: the constraint $\sum_k a_k^2=1$ lets you rewrite the equation as $$\sum_k a_k^2(\lambda_1-\lambda_k)=0\ .$$

Since $\lambda_1=\lambda_{\mathrm{max}}$, every term is non-negative, so each term must vanish individually: $a_k=0$ whenever $\lambda_k<\lambda_1$.

Hence $\mathbf{u}$ lies in the eigenspace of $\lambda_1$, and is therefore always an eigenvector for $\lambda_{\mathrm{max}}$. The solution $(a_1,\ldots,a_N)=(1,0,\ldots,0)$ is unique (up to sign) precisely when $\lambda_1$ is a simple eigenvalue; if $\lambda_1$ is degenerate, any unit combination of the top eigenvectors satisfies the equation as well.
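To illustrate the degenerate case, here is a small sketch (using numpy; the particular diagonal matrix is just a convenient example) where the top eigenvalue has multiplicity 2 and a mixture of the two top eigenvectors still attains the maximum:

```python
import numpy as np

# Diagonal J whose largest eigenvalue 2 is degenerate (multiplicity 2).
N = 3
lam_max = 2.0
J = np.diag([lam_max, lam_max, -1.0])

# Mix the two top eigenvectors: a = (1/sqrt(2), 1/sqrt(2), 0).
a = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u = np.sqrt(N) * a   # eigenvectors of a diagonal J are sqrt(N) * e_k

print(np.isclose(u @ u, N))                  # |u|^2 = N
print(np.isclose(u @ J @ u, N * lam_max))    # attains the maximum...
print(np.allclose(J @ u, lam_max * u))       # ...and is still an eigenvector,
                                             # even though a_1 != 1
```

So uniqueness of the coefficient vector fails here, but $\mathbf{u}$ is nonetheless an eigenvector for $\lambda_{\mathrm{max}}$, consistent with the argument above.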