I am trying to prove the following claim.
Suppose $x_1,\dots,x_L$ satisfy
(0) $\sum_{i=1}^{L}{x_i^2}=1$, where $x_i\ge0$ and $L\in \mathbb{N}$.
Then, if the exponents satisfy $1< p_i < p_j <2$ for all $1\le i < j \le L$, the maximizer of $\sum_{i=1}^{L}{x_{i}^{p_i}}$ satisfies $x_i<x_j$.
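As a numerical sanity check (not part of the proof), the claim can be tested on a small instance with SciPy; the exponents $p_i$ and the value $L=3$ below are arbitrary choices:

```python
import numpy as np
from scipy.optimize import minimize

# Arbitrary example: L = 3 with exponents strictly increasing in (1, 2).
p = np.array([1.2, 1.5, 1.8])

# Maximizing sum(x_i^{p_i}) is the same as minimizing its negative.
def neg_objective(x):
    return -np.sum(x ** p)

# Equality constraint (0): sum(x_i^2) = 1.
constraint = {"type": "eq", "fun": lambda x: np.sum(x ** 2) - 1.0}

# Start from the uniform feasible point x_i = 1/sqrt(L).
x0 = np.full(len(p), 1.0 / np.sqrt(len(p)))
res = minimize(neg_objective, x0, method="SLSQP",
               bounds=[(0.0, 1.0)] * len(p),
               constraints=[constraint])
x_opt = res.x
print(x_opt)  # the components should come out strictly increasing
```

For these exponents the optimizer lands on an interior point whose components are ordered the same way as the $p_i$, consistent with the claim.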
I tried to prove this using Lagrange multipliers, but I got stuck at step (3) below.
(1) Lagrangian function: $\mathcal{L}=\sum_{i=1}^{L}{x_{i}^{p_i}}+\lambda\left(\sum_{i=1}^{L}{x_i^2}-1\right)$
(2) $\partial\mathcal{L}/\partial x_i=p_i x_i^{p_i-1}+2\lambda x_i=0$ $\rightarrow$ $x_i=\left(\frac{-p_i}{2\lambda}\right)^\frac{1}{2-p_i}$ (for $x_i>0$, which forces $\lambda<0$)
(3) Substituting (2) into (0) gives $\sum_{i=1}^{L}\left(\frac{-p_i}{2\lambda}\right)^\frac{2}{2-p_i}=1$, but I cannot solve this for $\lambda$ in closed form.
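Even though $\lambda$ has no closed form in step (3), it can be found numerically: the left-hand side of the substituted equation is continuous in $\lambda<0$, tends to $+\infty$ as $\lambda\to0^-$ and to $0$ as $\lambda\to-\infty$, so a root exists and bisection applies. A sketch with the same arbitrary exponents as above:

```python
import numpy as np
from scipy.optimize import brentq

# Arbitrary exponents with 1 < p_i < p_j < 2.
p = np.array([1.2, 1.5, 1.8])

# From (2): x_i = (-p_i / (2*lam))**(1/(2-p_i)), which requires lam < 0.
def x_of_lambda(lam):
    return (-p / (2.0 * lam)) ** (1.0 / (2.0 - p))

# Step (3): find lam < 0 such that sum(x_i^2) - 1 = 0.
def g(lam):
    return np.sum(x_of_lambda(lam) ** 2) - 1.0

# g(-50) is close to -1 and g(-0.01) is large and positive,
# so the root is bracketed.
lam = brentq(g, -50.0, -0.01)
x = x_of_lambda(lam)
print(lam, x)
```

This does not prove the ordering claim, but it recovers the stationary point implicitly defined by (2) and (3), which can then be inspected directly.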
I would really appreciate any hints on how to complete this proof.