On matrix derivatives


I'm a newbie in convex optimization, and I need to understand how to differentiate functions of matrices and vectors (including norms).

So, how can I calculate the gradient and Hessian of these two functions? Images of the equations are here

Basically, I don't understand how to reduce the long expressions obtained after differentiation to a standard form. All norms are the standard (Euclidean) norms on $\mathbb{R}^n$.
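To illustrate the kind of calculation I mean, here is a simpler, hypothetical example (this is my own illustration, not one of the two functions from the images): the least-squares function, whose gradient and Hessian do come out in a clean standard form.

```latex
% Hypothetical example: f(x) = (1/2) ||Ax - b||^2 with A in R^{m x n}, b in R^m.
% Expanding f(x) = (1/2)(Ax - b)^T (Ax - b) and differentiating with respect
% to x gives the standard-form gradient and (constant) Hessian:
\[
  f(x) = \tfrac{1}{2}\,\|Ax - b\|^2, \qquad
  \nabla f(x) = A^{\top}(Ax - b), \qquad
  \nabla^2 f(x) = A^{\top} A .
\]
```

What I'm asking is how to carry out this kind of simplification systematically when the functions are more complicated than this one.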