$\newcommand{\vx}{\mathbf{x}}$ While solving a certain problem, I bumped into the following Lagrangian. $$ \mathcal{L}(\vx, \lambda) = \frac{(e^\vx) ^\top A e^\vx}{(e^\vx)^\top B e^\vx} - \lambda ((e^\vx)^\top \mathbf{1} - 1) $$ Here $\vx\in\mathbb{R}^d$, and $e^\vx$ means that we exponentiate each of its elements. $\lambda$ is the Lagrange multiplier for the condition that the elements of $e^\vx$ sum up to one. I would like to solve the optimization problem and I am therefore interested in $\nabla_\vx \mathcal{L}(\vx, \lambda)$.
Attempt at the derivative
I have never bumped into a problem like this, so I am unsure how to take derivatives properly.
$$ \begin{align} \nabla_\vx \left[(e^\vx) ^\top A e^\vx\right] &= (e^\vx \mathrm{I})(2 A e^\vx) = 2(e^\vx)^\top Ae^\vx \\ \nabla_\vx\left[(e^\vx) ^\top B e^\vx\right] &= 2 (e^\vx)^\top B e^\vx \end{align} $$ which, by the quotient rule, would give $$ \nabla_\vx \mathcal{L}(\vx, \lambda) = \frac{2((e^\vx)^\top A e^\vx)((e^\vx)^\top B e^\vx) - 2((e^\vx)^\top B e^\vx)((e^\vx)^\top A e^\vx)}{((e^\vx)^\top B e^\vx)^2} - \lambda e^\vx = -\lambda e^\vx, $$ but surely this is incorrect.
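Indeed, a quick finite-difference check (a NumPy sketch with random $A$, $B$, $\vx$, and $\lambda$; all variable names are mine) shows that the numerical gradient does not match $-\lambda e^\vx$:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))
B = B @ B.T + d * np.eye(d)   # make B positive definite so the denominator stays positive
x = rng.standard_normal(d)
lam = 0.7

def L(x):
    """The Lagrangian evaluated at x."""
    u = np.exp(x)
    return (u @ A @ u) / (u @ B @ u) - lam * (u.sum() - 1)

# central finite differences along each coordinate direction
eps = 1e-6
g_num = np.array([(L(x + eps * e) - L(x - eps * e)) / (2 * eps)
                  for e in np.eye(d)])

print(np.allclose(g_num, -lam * np.exp(x), atol=1e-4))
```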
$\newcommand{\vx}{\mathbf{x}}$ I think the problem arises from how you compute $\nabla_\vx e^\vx$.
Here is what I think is the correct way to compute the derivative, using the convention that $\frac{\partial \mathbf{u}}{\partial \vx}$ denotes the $d \times d$ matrix with entries $\left[\frac{\partial \mathbf{u}}{\partial \vx}\right]_{ij} = \frac{\partial u_j}{\partial x_i}$: $$ \begin{align} \nabla_\vx \left[(e^\vx) ^\top A e^\vx\right] &= \frac{\partial e^\vx}{\partial\vx}A e^\vx + \frac{\partial A e^\vx}{\partial\vx} e^\vx \\ &= \begin{bmatrix} \frac{\partial e^{x_1}}{\partial x_1} & \frac{\partial e^{x_2}}{\partial x_1} & \cdots &\frac{\partial e^{x_d}}{\partial x_1} \\ \frac{\partial e^{x_1}}{\partial x_2} & \frac{\partial e^{x_2}}{\partial x_2} & \cdots &\frac{\partial e^{x_d}}{\partial x_2} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial e^{x_1}}{\partial x_d} & \frac{\partial e^{x_2}}{\partial x_d} & \cdots &\frac{\partial e^{x_d}}{\partial x_d} \end{bmatrix} A e^\vx + \frac{\partial e^\vx}{\partial\vx}A ^\top e^\vx \\ &= \begin{bmatrix} e^{x_1} & 0 & \cdots & 0 \\ 0 & e^{x_2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0& \cdots & e^{x_d} \end{bmatrix} A e^\vx + \begin{bmatrix} e^{x_1} & 0 & \cdots & 0 \\ 0 & e^{x_2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0& \cdots & e^{x_d} \end{bmatrix} A ^\top e^\vx \\ &= \mathrm{diag}(e^\vx)\left( A + A^\top \right)e^\vx \end{align} $$ The key point is that $\frac{\partial e^\vx}{\partial\vx} = \mathrm{diag}(e^\vx)$ is a diagonal matrix, not a scalar, which is why the numerator in your attempt only appeared to cancel. The same computation with $B$ in place of $A$ gives $\nabla_\vx \left[(e^\vx)^\top B e^\vx\right] = \mathrm{diag}(e^\vx)\left(B + B^\top\right) e^\vx$, and for the constraint term $\nabla_\vx \left[(e^\vx)^\top \mathbf{1}\right] = e^\vx$.
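For completeness, assembling the pieces with the quotient rule (treat this as a sketch), write $D = \mathrm{diag}(e^\vx)$, $f = (e^\vx)^\top A e^\vx$, and $g = (e^\vx)^\top B e^\vx$; then the full gradient should be $$ \nabla_\vx \mathcal{L}(\vx, \lambda) = \frac{g\, D (A + A^\top) e^\vx - f\, D (B + B^\top) e^\vx}{g^2} - \lambda e^\vx. $$ A finite-difference check in NumPy (random data; variable names are my own) agrees with this expression:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 5
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))
B = B @ B.T + d * np.eye(d)   # positive definite, so g > 0
x = rng.standard_normal(d)
lam = 0.3

def L(x):
    """The Lagrangian evaluated at x."""
    u = np.exp(x)
    return (u @ A @ u) / (u @ B @ u) - lam * (u.sum() - 1)

# analytic gradient: (g * D(A+A^T)u - f * D(B+B^T)u) / g^2 - lam * u
u = np.exp(x)
D = np.diag(u)
f, g = u @ A @ u, u @ B @ u
g_analytic = (g * D @ (A + A.T) @ u - f * D @ (B + B.T) @ u) / g**2 - lam * u

# central finite differences along each coordinate direction
eps = 1e-6
g_num = np.array([(L(x + eps * e) - L(x - eps * e)) / (2 * eps)
                  for e in np.eye(d)])

print(np.allclose(g_analytic, g_num, atol=1e-5))  # True
```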