I'm trying to write a steepest descent algorithm in Matlab, and I have to solve a line search problem via a Newton's method subroutine. To make it work, I have to compute the derivative of the following function:
$A \in \mathbb{R}^{n\times n},\\ u, f, g \in \mathbb{R}^n,\\ \lambda, h, p \in \mathbb{R}$
The differentiation is with respect to $p$, and the result should be a scalar.
Also, in my Matlab code the vectors are stored as row vectors, so $u$ is a row vector and $u^T$ a column vector.
$$J(p) = -(u-pg)Ag^T \;-\;\lambda h^2e^{(u-pg)}g^T \; + \; h^2fg^T$$
When I differentiate with respect to $p$, I immediately see that the last term vanishes, and I believe the first term should become $$gAg^T,$$ which is a scalar. The trouble comes with the middle term: I'm not sure how to differentiate it correctly so that the result is compatible with the rest of the solution, i.e. so that it also outputs a scalar.
One simple approach is just to write out $$ e^{(u - pg)} g^T = \sum_i e^{u_i - p g_i} g_i, $$ since the exponential is applied entrywise. We can now differentiate with respect to $p$ term by term, obtaining $\sum_i e^{u_i - p g_i}(-g_i) g_i$, which in vector notation is $-e^{(u-pg)}(g \circ g)^T$, where $\circ$ denotes the entrywise product. Combining this with the $-\lambda h^2$ prefactor and the other two terms gives $$J'(p) = gAg^T + \lambda h^2\, e^{(u-pg)}(g \circ g)^T.$$
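As a sanity check, you can compare this formula against a finite-difference approximation of $J'(p)$ at a few points. Here is a quick NumPy sketch of that check (a Python stand-in for the Matlab code; all the variable values below are arbitrary test data, not from the original problem):

```python
import numpy as np

# Arbitrary test data (hypothetical values, just for checking the formula).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
u = rng.standard_normal(n)
f = rng.standard_normal(n)
g = rng.standard_normal(n)
lam, h = 0.7, 0.1

def J(p):
    # J(p) = -(u - p g) A g^T - lam h^2 e^{u - p g} g^T + h^2 f g^T,
    # with the exponential applied entrywise.
    v = u - p * g
    return -v @ A @ g - lam * h**2 * np.exp(v) @ g + h**2 * f @ g

def dJ(p):
    # Claimed derivative: g A g^T + lam h^2 e^{u - p g} (g ∘ g)^T.
    v = u - p * g
    return g @ A @ g + lam * h**2 * np.exp(v) @ (g * g)

# Central finite difference at an arbitrary p.
p, eps = 0.3, 1e-6
fd = (J(p + eps) - J(p - eps)) / (2 * eps)
print(abs(fd - dJ(p)))  # should be tiny (finite-difference error only)
```

If the printed discrepancy is on the order of the finite-difference truncation error, the analytic derivative is correct and safe to use inside the Newton subroutine.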