Partial derivative of a matrix product w.r.t. a vector


What is the partial derivative of

$$ w^T\beta \cdot \Sigma \cdot \beta^Tw $$

with respect to $w$?

Dimensions:

  • $w$ is a $N \times 1$ vector ("asset weights")
  • $\beta$ is a $N \times K$ matrix ("factor loadings")
  • $\Sigma$ is a $K \times K$ matrix ("factor covariance matrix")

Background: multi-factor risk modelling in finance (see Chapter 1). This expression is the systematic component of total portfolio risk, in variance terms.
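As a quick sanity check of the shapes, the expression can be evaluated numerically. The dimensions $N = 5$, $K = 3$ and the random inputs below are illustrative placeholders, not from the question:

```python
import numpy as np

# Hypothetical small sizes: N assets, K factors
N, K = 5, 3
rng = np.random.default_rng(0)

w = rng.standard_normal((N, 1))       # N x 1 asset weights
beta = rng.standard_normal((N, K))    # N x K factor loadings
F = rng.standard_normal((K, K))
Sigma = F @ F.T                       # K x K symmetric PSD covariance

# (1 x K) @ (K x K) @ (K x 1) -> 1 x 1: the systematic variance is a scalar
risk = w.T @ beta @ Sigma @ beta.T @ w
print(risk.shape)  # (1, 1)
```

Note that because $\Sigma$ is positive semi-definite, the resulting scalar is non-negative, as a variance should be.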

Best answer

Defining $A := \beta \cdot \Sigma \cdot \beta^{T}$ (an $N \times N$ matrix), the derivative is $w^T \cdot (A + A^T)$. Since $\Sigma$ is a covariance matrix and hence symmetric, $A = A^T$, so this simplifies to $2\,w^T A$.

To prove this, apply the product rule to the scalar form of the expression, using $\frac{dw_j}{dw_k} = \delta_{jk}$:

$$ \frac{d}{dw_k}\bigg( \sum_{j=1}^{N}w_j \sum_{i=1}^{N}w_iA_{ji} \bigg) = \sum_{j=1}^{N}\frac{dw_j}{dw_k}\sum_{i=1}^{N}w_iA_{ji}+\sum_{j=1}^{N}w_j\sum_{i=1}^{N}\frac{dw_i}{dw_k}A_{ji}= \sum_{i=1}^{N}w_iA_{ki} + \sum_{j=1}^{N}w_jA_{jk} $$

Arranging these $N$ components ($k = 1, \dots, N$) in a row vector, the first sum is the $k$-th entry of $w^T A^T$ and the second is the $k$-th entry of $w^T A$, which gives the matrix form $w^T(A + A^T)$ stated above.
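The formula can be verified against central finite differences. This is a minimal sketch with hypothetical sizes ($N = 5$, $K = 3$) and random data; since the objective is quadratic, the central difference is exact up to floating-point rounding:

```python
import numpy as np

N, K = 5, 3
rng = np.random.default_rng(1)
w = rng.standard_normal(N)
beta = rng.standard_normal((N, K))
F = rng.standard_normal((K, K))
Sigma = F @ F.T                      # symmetric K x K covariance

A = beta @ Sigma @ beta.T            # N x N
f = lambda v: v @ A @ v              # scalar quadratic form v^T A v

analytic = w @ (A + A.T)             # gradient as a row vector

# Central finite differences along each coordinate direction
eps = 1e-6
numeric = np.array([
    (f(w + eps * e) - f(w - eps * e)) / (2 * eps)
    for e in np.eye(N)
])
print(np.allclose(analytic, numeric, atol=1e-5))  # True
```

Because $\Sigma$ here is symmetric, `analytic` equals `2 * w @ A`, matching the simplified form of the derivative.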