Minimize $\displaystyle\sum_{i=1}^n \frac{x_i^2}{w_i}$ subject to $\displaystyle\sum_{i=1}^n x_i=1$.
The answer is $x_i=\displaystyle\frac{w_i}{\sum_{j=1}^n w_j}$, but I don't know why, apart from plugging it in after taking the first derivative and setting it to $0$. A hint is appreciated!
Edit: I get $$\begin{align} \Lambda(v_j,\lambda) &= \sigma^2\sum_j\frac{v_j^2}{w_j}+\lambda\left(\sum_j v_j-1\right) \\ \frac{\partial}{\partial v_j}\Lambda(v_j,\lambda) &= \frac{2\sigma^2 v_j}{w_j}+\lambda=0 \\ \frac{\partial}{\partial\lambda}\Lambda(v_j,\lambda) &= \sum_j v_j-1 = 0. \end{align}$$ (Differentiating with respect to a single $v_j$ kills all the other terms of the sum.) Then $\lambda=-\frac{2\sigma^2 v_j}{w_j}$ for every $j$.
Not sure what's supposed to happen next.
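For reference, one way the stationarity conditions above can be finished (a sketch, not necessarily the intended route): solve the first condition for $v_j$ and substitute into the constraint,
$$v_j=-\frac{\lambda w_j}{2\sigma^2},\qquad \sum_j v_j=-\frac{\lambda}{2\sigma^2}\sum_j w_j=1 \;\Longrightarrow\; \lambda=-\frac{2\sigma^2}{\sum_j w_j},$$
so that $v_j=\dfrac{w_j}{\sum_j w_j}$, matching the claimed answer.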
Without loss of generality, assume $\sum\limits_{i=1}^nw_i=1$.
Now complete the square: $$\sum_{i=1}^n\frac{x_i^2-2x_iw_i+w_i^2}{w_i}=\sum_{i=1}^n\frac{(x_i-w_i)^2}{w_i}.$$ Both terms we added are constants on the feasible set: $\sum\limits_{i=1}^n\frac{2x_iw_i}{w_i}=2\sum\limits_{i=1}^nx_i=2$ and $\sum\limits_{i=1}^n\frac{w_i^2}{w_i}=\sum\limits_{i=1}^nw_i=1$. So minimizing the original objective is equivalent to minimizing this sum of squares.
So $x_i=w_i$ is the unique minimizer: each term $\frac{(x_i-w_i)^2}{w_i}$ is nonnegative (the $w_i$ are positive), so the sum is $0$ exactly when $x_i=w_i$ for all $i$, and this choice is feasible because $\sum\limits_{i=1}^nx_i=\sum\limits_{i=1}^nw_i=1$.
This is your answer: undoing the normalization $\sum\limits_{i=1}^nw_i=1$ just replaces each $w_i$ by $\frac{w_i}{\sum_{j=1}^nw_j}$, giving $x_i=\frac{w_i}{\sum_{j=1}^nw_j}$.
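The closed form is easy to sanity-check numerically. A minimal Python sketch (the weights `w` below are made-up example values, not from the question): it compares the objective at $x_i=w_i/\sum_j w_j$ against many random feasible points.

```python
import random

random.seed(0)
w = [0.5, 1.5, 2.0, 4.0]   # arbitrary positive example weights
W = sum(w)

def objective(x):
    """The quantity being minimized: sum of x_i^2 / w_i."""
    return sum(xi * xi / wi for xi, wi in zip(x, w))

x_star = [wi / W for wi in w]   # claimed minimizer
f_star = objective(x_star)      # equals 1 / sum(w_i)

for _ in range(1000):
    # Random point on the constraint hyperplane sum(x) = 1.
    y = [random.uniform(-1, 1) for _ in w]
    s = sum(y)
    x = [yi + (1 - s) / len(w) for yi in y]
    assert objective(x) >= f_star - 1e-12

print(f_star, 1 / W)  # both are 1/8 = 0.125 for these weights
```

Note that the minimum value itself comes out as $\sum_i (w_i/W)^2/w_i = \sum_i w_i/W^2 = 1/\sum_i w_i$, which the script confirms.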