EDIT: Original question was linear. I have changed the problem to be quadratic instead, replacing each $x_i$ with $x_i^2$, but the difficulty remains the same.
I am extremely rusty with Lagrange multiplier optimization, so as a refresher I was looking at the following problem, but for the life of me I cannot see how to finish solving it.
I am trying to maximise: $$f(x_1, ..., x_n) = \sum^n_{i=1} x_i^2 c_i$$ subject to the constraint that $$\sum^n_{i=1} x_i^2 = 1$$
It is intuitively clear that the solution should be $|x_j| = 1$ for the index $j$ at which $c_j$ is maximal, and $x_i = 0$ for every other index $i$.
However, attempting to show this using Lagrange multipliers, I consider the Lagrangian $$\sum^n_{i=1}x_i^2 c_i - \lambda \left[1 - \sum^n_{i=1} x_i^2 \right]$$
The partial derivative with respect to $x_j$ yields $$2 x_j c_j + 2 x_j \lambda = 0$$
so for each $j$ we must have either $x_j = 0$ or $\lambda = -c_j$.
The partial derivative with respect to $\lambda$ yields $$\sum^n_{i=1} x_i^2 = 1$$
I'm not sure how to proceed from here. Specifically, which $j \in [n]$ do we let $\lambda = -c_j$?
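For a concrete picture of the critical points, here is a small symbolic sketch (my own illustration, not part of the original problem) for $n = 2$ with the arbitrary choice $c = (3, 1)$, using sympy to solve the stationarity equations together with the constraint:

```python
import sympy as sp

# Illustration for n = 2 with concrete coefficients c = (3, 1) (my choice).
x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
c1, c2 = 3, 1

# The Lagrangian from the question, specialized to n = 2.
L = c1*x1**2 + c2*x2**2 - lam*(1 - x1**2 - x2**2)

# Stationarity in x1, x2 plus the constraint (the lam-derivative).
eqs = [sp.diff(L, v) for v in (x1, x2, lam)]
sols = sp.solve(eqs, (x1, x2, lam), dict=True)

for s in sols:
    # Objective value f = c1*x1^2 + c2*x2^2 at each critical point.
    print(s, '  f =', c1*s[x1]**2 + c2*s[x2]**2)
```

Enumerating the critical points this way shows each one has objective value $c_j$ (here $3$ at $(\pm 1, 0)$ with $\lambda = -3$, and $1$ at $(0, \pm 1)$ with $\lambda = -1$), so the question of "which $j$" is settled by comparing those values.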
EDIT 2: Staring at the problem longer, I believe I have figured out how to proceed, but I would appreciate it if someone could let me know whether the approach is valid/best. For each $j = 1, \dots, n$ in turn, I let $\lambda = - c_j$, which (assuming the $c_i$ are distinct) forces $x_k = 0$ for $k \neq j$, and the constraint then forces $x_j^2 = 1$.
Hence: $$\sum^n_{i=1}x_i^2 c_i - \lambda \left[1 - \sum^n_{i=1} x_i^2 \right] = x_j^2 c_j - x_j^2 c_j + c_j = c_j$$
Comparing the values $c_j$ across $j = 1, ..., n$, it is clear that we must choose $\lambda = -c_m$ where $c_m$ is maximal. Then $x_k = 0$ for $k \neq m$, and thus $|x_m| = 1$.
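As a numerical sanity check of this conclusion (my own sketch, with an arbitrary choice of $c$), one can sample random feasible points on the unit sphere and confirm that $f$ never exceeds $\max_i c_i$, the value attained at the claimed maximizer:

```python
import numpy as np

# Arbitrary coefficients for illustration; max is c[1] = 2.0.
c = np.array([0.3, 2.0, -1.0, 0.7])
m = np.argmax(c)

# Claimed maximizer: |x_m| = 1, all other coordinates zero.
x_star = np.zeros_like(c)
x_star[m] = 1.0

def f(x):
    return np.sum(c * x**2)

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=c.size)
    x /= np.linalg.norm(x)          # project onto the constraint set
    assert f(x) <= f(x_star) + 1e-12

print(f(x_star))  # prints 2.0, i.e. max(c)
```

This also matches the observation that $f(x) = \sum_i c_i x_i^2$ with $\sum_i x_i^2 = 1$ is a weighted average of the $c_i$, hence bounded by $\max_i c_i$.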
Letting $e = (1,1,1, \cdots ,1)^{T},$ the KKT conditions for the maximization problem in the linear version are: $$\begin{align} c+\lambda e+\mu = &\ 0, \tag{stationarity}\\ \mu \geq 0 \text{ and } \lambda \in &\ \mathbb{R}\tag{dual feasibility}\\ x^{T} \mu = &\ 0, \tag{complementary slackness} \\ x^{T} e = 1, \text{and } x \geq & \ 0 \tag{primal feasibility}, \end{align}$$
Suppose $x_i>0$. By complementary slackness, $\mu_i = 0$, and then the stationarity condition gives $$ \lambda + c_i = 0. $$ This means that $c-c_i e=-\mu \leq 0.$ Hence, $$c_j \leq c_i,$$ for all $j.$ Thus $\max_{j=1,\cdots, n}\{c_j\} \leq c_i \leq \max_{j=1,\cdots, n}\{c_j\}$, so $c_i= \max_{j=1,\cdots, n}\{c_j\}.$
To conclude, any $x$ with $e^{T}x = 1$ and $x \geq 0$ such that $x_i = 0$ whenever $$c_i - \max_{j=1,\cdots, n}\{c_j\}<0$$ is a solution, since any such $x$ satisfies the KKT conditions of this linear programming problem. That is the solution set (necessarily non-empty, since you found a solution).
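To make these conditions concrete, here is a short numerical sketch (my own construction, with an arbitrary $c$) verifying that $x = e_m$ with $m = \arg\max_i c_i$, together with $\lambda = -c_m$ and $\mu = -c - \lambda e$, satisfies all four KKT conditions above:

```python
import numpy as np

# Arbitrary coefficients for illustration (mine, not from the answer).
c = np.array([0.3, 2.0, -1.0, 0.7])
n = c.size
e = np.ones(n)

m = np.argmax(c)
x = np.zeros(n)
x[m] = 1.0                        # candidate solution e_m
lam = -c[m]                       # lambda = -c_i from stationarity
mu = -c - lam * e                 # mu solves c + lambda*e + mu = 0

assert np.allclose(c + lam * e + mu, 0)            # stationarity
assert np.all(mu >= 0)                             # dual feasibility
assert abs(x @ mu) < 1e-12                         # complementary slackness
assert abs(x @ e - 1) < 1e-12 and np.all(x >= 0)   # primal feasibility
print("KKT conditions hold; optimal value:", c @ x)
```

Note that complementary slackness holds because $\mu_m = -c_m + c_m = 0$ exactly at the index carrying all the mass, which is the mechanism behind the "$x_i = 0$ unless $c_i$ is maximal" characterization.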