I've encountered an optimization problem that I would appreciate some help with.
The problem is stated as follows:
$$\min_{\alpha_1,\dots,\alpha_N} \sqrt{\sum^N_{i=1} \alpha_i^2 \sigma_i} + \sum^N_{i=1}\alpha_i\beta_i + \mu $$
s.t. $$\sum^N_{i=1}\alpha_{i} = 1, \hspace{1cm}\alpha_i > 0\hspace{0.5cm}\forall i, $$
where $\sigma_i$, $\beta_i$, and $\mu$ are positive constants for all $i = 1,\dots,N$.
I tried using Lagrange multipliers but reached a form from which I couldn't continue.
Can anyone suggest another approach that I can follow?
Ignoring the positivity constraint on $\alpha$ (the constant $\mu$ does not affect the minimizer and can be dropped), the Lagrangian reads
$$ L(\alpha,\lambda) = f(\alpha) - \lambda\left(\sum_{i=1}^N\alpha_i-1\right) $$
where $f(\alpha) = \sqrt{\sum_{i=1}^N\sigma_i\alpha_i^2}+ \sum_{i=1}^N\beta_i\alpha_i$. The stationary points for $L$ are the solutions for
$$ \nabla L = 0 \;\Longrightarrow\; \cases{\frac{\sigma_k\alpha_k}{\sqrt{\sum_{i=1}^N\sigma_i\alpha_i^2}}+\beta_k-\lambda=0,\ \ k = 1,\cdots,N\\ \sum_{i=1}^N\alpha_i=1 } $$
or, introducing $\lambda_0 = \lambda\sqrt{\sum_{i=1}^N\sigma_i\alpha_i^2}$,
$$ \cases{\alpha_k=\frac{1}{\sigma_k}\left(\lambda_0-\sqrt{\sum_{i=1}^N\sigma_i\alpha_i^2}\,\beta_k\right),\ \ k = 1,\cdots,N\\ \sum_{i=1}^N\alpha_i=1 } $$
To eliminate $\lambda_0$, we sum the first $N$ equations over $k$ and use $\sum_{i=1}^N\alpha_i=1$, giving
$$ 1 = \lambda_0\sum_{i=1}^N\frac{1}{\sigma_i}-\sqrt{\sum_{j=1}^N\sigma_j\alpha_j^2}\sum_{i=1}^N\frac{\beta_i}{\sigma_i} $$
and after substituting $\lambda_0$ back we get
$$ \alpha_k = \frac{1}{\sigma_k}\left(\frac{1}{S_{\sigma}}\left(1+\sqrt{S_{\alpha}}S_{\beta/\sigma}\right)-\sqrt{S_{\alpha}}\beta_k\right) $$
Here
$$ \cases{ S_{\alpha} = \sum_{j=1}^N\sigma_j\alpha_j^2\\ S_{\beta/\sigma} = \sum_{i=1}^N\frac{\beta_i}{\sigma_i}\\ S_{\sigma} = \sum_{i=1}^N\frac{1}{\sigma_i} } $$
This gives us a fixed-point iteration for the $\alpha_k$: evaluate $S_\alpha$ at the current iterate and update
$$ \alpha_k^{(t+1)} = \frac{1}{\sigma_k}\left(\frac{1}{S_\sigma}\left(1+\sqrt{S_\alpha^{(t)}}\,S_{\beta/\sigma}\right)-\sqrt{S_\alpha^{(t)}}\,\beta_k\right),\qquad S_\alpha^{(t)}=\sum_{j=1}^N\sigma_j\left(\alpha_j^{(t)}\right)^2. $$
A MATHEMATICA script implementing the procedure follows; it sets any negative $\alpha_k$ to $0$. The procedure usually converges, and the results can be compared with the corresponding minimization command NMinimize.
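For readers without Mathematica, here is a minimal Python/NumPy sketch of the same fixed-point iteration. The function name `fixed_point_alpha` and the renormalization after clipping are my additions, not part of the original script (which only sets negative components to zero):

```python
import numpy as np

def fixed_point_alpha(sigma, beta, n_iter=200, tol=1e-12):
    """Fixed-point iteration for the stationarity conditions above.

    Update: alpha_k = (1/sigma_k) * ((1 + s*S_bs)/S_sigma - s*beta_k),
    where s = sqrt(sum_j sigma_j alpha_j^2).
    Negative components are clipped to 0 and the vector is
    renormalized so that sum(alpha) = 1 stays enforced.
    """
    sigma = np.asarray(sigma, dtype=float)
    beta = np.asarray(beta, dtype=float)
    S_sigma = np.sum(1.0 / sigma)          # sum_i 1/sigma_i
    S_bs = np.sum(beta / sigma)            # sum_i beta_i/sigma_i
    alpha = np.full(len(sigma), 1.0 / len(sigma))  # uniform start
    for _ in range(n_iter):
        s = np.sqrt(np.sum(sigma * alpha**2))      # sqrt(S_alpha)
        new = ((1.0 + s * S_bs) / S_sigma - s * beta) / sigma
        new = np.clip(new, 0.0, None)              # enforce alpha_i >= 0
        new /= new.sum()                           # re-impose sum = 1
        if np.max(np.abs(new - alpha)) < tol:
            alpha = new
            break
        alpha = new
    return alpha
```

Note that when no clipping occurs the update already satisfies $\sum_k\alpha_k = 1$ exactly, since summing the update reproduces the elimination identity for $\lambda_0$. The converged result can be checked against a general-purpose solver (e.g. `scipy.optimize.minimize` with an equality constraint), which plays the role of NMinimize here.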