I would like to solve the following quadratic optimization problem:
$$ \max \sum_{j}(\alpha_j d_j+0.5\beta_jd_j^2) - \sum_{i}(\gamma_i s_i+0.5 \delta_i s_i^2) - \sum_{i}\sum_{j}c_{ij}x_{ij} $$
$$ \text{s.t.} \quad \sum_{i}x_{ij} = d_j \quad \forall j, \qquad \sum_{j}x_{ij} = s_i \quad \forall i $$
with variables $d_j$, $s_i$, $x_{ij}$ and parameters $\alpha_j$, $\beta_j$, $\gamma_i$, $\delta_i$, $c_{ij}$.
I can obtain the optimal solution using a general-purpose quadratic or nonlinear solver. However, I would also like to solve the problem via Lagrange multipliers.
I assume this would be the Lagrangian:
$$ L = \sum_{j}(\alpha_j d_j+0.5\beta_jd_j^2) - \sum_{i}(\gamma_i s_i+0.5 \delta_i s_i^2) - \sum_{i}\sum_{j}c_{ij}x_{ij} \\+ \sum_{j}\mu_{j} \left(\sum_{i}x_{ij}-d_j\right) + \sum_{i}\lambda_{i} \left(\sum_{j}x_{ij} - s_i \right)$$
Setting the partial derivatives to zero gives the following first-order conditions:
$$ \frac{\partial L}{\partial d_j} = \alpha_j + \beta_jd_j - \mu_j = 0 \;\;\;\;\; \forall j \\ \frac{\partial L}{\partial s_i} = \gamma_i + \delta_i s_i + \lambda_i = 0 \;\;\;\;\; \forall i \\ \frac{\partial L}{\partial x_{ij}} = c_{ij} -\mu_{j} - \lambda_{i} = 0 \;\;\;\;\; \forall i,j $$
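For reference, these first-order conditions together with the two constraint sets form a linear system in $(d, s, x, \mu, \lambda)$. Here is a minimal numpy sketch of that system; all parameter values are made up for illustration (they are not from any particular dataset), with $\beta_j < 0$ and $\delta_i > 0$ so the objective's quadratic terms are concave:

```python
import numpy as np

# Made-up example data (assumptions, not from the question):
# 2 supply nodes i, 2 demand nodes j.
alpha = np.array([10.0, 12.0])   # alpha_j
beta  = np.array([-1.0, -1.5])   # beta_j  (negative => concave demand term)
gamma = np.array([1.0, 2.0])     # gamma_i
delta = np.array([0.5, 0.8])     # delta_i (positive => convex supply cost)
c = np.array([[1.0, 2.0],        # c_ij
              [2.5, 1.0]])
I, J = c.shape

# Unknown vector z = [d (J), s (I), x (I*J, row-major), mu (J), lam (I)]
n = J + I + I * J + J + I
A = np.zeros((n, n))
b = np.zeros(n)

def d_idx(j): return j
def s_idx(i): return J + i
def x_idx(i, j): return J + I + i * J + j
def mu_idx(j): return J + I + I * J + j
def lam_idx(i): return J + I + I * J + J + i

row = 0
# dL/dd_j = alpha_j + beta_j d_j - mu_j = 0
for j in range(J):
    A[row, d_idx(j)] = beta[j]
    A[row, mu_idx(j)] = -1.0
    b[row] = -alpha[j]
    row += 1
# dL/ds_i = gamma_i + delta_i s_i + lam_i = 0
for i in range(I):
    A[row, s_idx(i)] = delta[i]
    A[row, lam_idx(i)] = 1.0
    b[row] = -gamma[i]
    row += 1
# dL/dx_ij = c_ij - mu_j - lam_i = 0  (note: x_ij itself does not appear)
for i in range(I):
    for j in range(J):
        A[row, mu_idx(j)] = -1.0
        A[row, lam_idx(i)] = -1.0
        b[row] = -c[i, j]
        row += 1
# feasibility: sum_i x_ij = d_j
for j in range(J):
    for i in range(I):
        A[row, x_idx(i, j)] = 1.0
    A[row, d_idx(j)] = -1.0
    row += 1
# feasibility: sum_j x_ij = s_i
for i in range(I):
    for j in range(J):
        A[row, x_idx(i, j)] = 1.0
    A[row, s_idx(i)] = -1.0
    row += 1

# Least squares always returns a candidate; check whether it solves
# the system exactly, i.e. whether the conditions are simultaneously
# satisfiable for this data.
z, _res, rank, _sv = np.linalg.lstsq(A, b, rcond=None)
consistent = bool(np.allclose(A @ z, b))
print("rank:", rank, "of", n, "| system exactly solvable:", consistent)
```

Note that the $x$-stationarity rows force $c_{ij} = \mu_j + \lambda_i$ for every pair $(i,j)$, which is $IJ$ equations on only $I + J$ multipliers, so for generic $c_{ij}$ (like the made-up values above) the check reports that the system has no exact solution.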
Unfortunately, using the same data and solver as for the original problem, I cannot find a feasible solution to these equations. I therefore suspect that the derivatives are incorrect, but I cannot see where I made a mistake.