Find the maximum value without computing matrix


I have an optimization problem: find the $h^*$ such that $$h^* = \operatorname*{argmax}_h\left(h'\alpha-\frac{\kappa}{2}h'\Sigma h\right)$$ where $\alpha$ is an $(n \times 1)$ vector and $\Sigma$ is an $(n\times n)$ matrix of the form $$\Sigma = XFX' + D,$$ where $X$ is an $(n\times k)$ matrix, $F$ is a $(k\times k)$ covariance matrix of factor loadings, and $D = \operatorname{diag}(\sigma_1^2,\dots,\sigma_n^2)$ is a diagonal matrix.

How can I write a function to compute $h^*$ without explicitly forming the $(n\times n)$ matrix $\Sigma$?

A covariance matrix is symmetric, so $F'=F$, and hence $XFX'$ is symmetric for any $X$: $(XFX')' = XF'X' = XFX'$. The diagonal matrix $D$ is symmetric as well, so their sum is too. In particular this means that $\Sigma'=\Sigma$.

When $h$ is optimal, each partial derivative of the expression must be $0$. Starting with the first:

$$\frac{\partial}{\partial h_1}\left(h'\alpha -\frac\kappa 2h'\Sigma h\right) = e_1'\alpha - \frac\kappa 2e_1'\Sigma h - \frac\kappa 2h'\Sigma e_1 = e_1'\alpha - \kappa e_1'\Sigma h = \alpha_1 - \kappa \Sigma_1 h = 0$$ where $\Sigma_1$ is the first row of $\Sigma$, and we used $\Sigma'=\Sigma$ to combine the two middle terms.

Collecting all $n$ partial derivatives into a single gradient: $$\nabla \left(h'\alpha -\frac\kappa 2h'\Sigma h\right) = \alpha - \kappa \Sigma h = 0$$
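As a quick sanity check of this gradient formula, one can compare it against central finite differences of the objective. A minimal NumPy sketch (the variable names and test matrices are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5
Sigma = rng.standard_normal((n, n))
Sigma = Sigma @ Sigma.T + n * np.eye(n)   # symmetric positive definite
alpha = rng.standard_normal(n)
kappa, h = 2.0, rng.standard_normal(n)

def f(h):
    # objective: h'alpha - (kappa/2) h'Sigma h
    return h @ alpha - 0.5 * kappa * h @ Sigma @ h

grad = alpha - kappa * Sigma @ h          # the derived gradient

# central finite differences along each coordinate direction
eps = 1e-6
fd = np.array([(f(h + eps * e) - f(h - eps * e)) / (2 * eps)
               for e in np.eye(n)])
print(np.max(np.abs(grad - fd)))          # should be tiny
```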

We can find $h^*$ now by solving: $$\Sigma h^* = \frac 1\kappa \alpha \implies h^*=\frac 1\kappa\Sigma^{-1}\alpha$$

This system can be solved with, for instance, Gaussian elimination, or more robustly with a Cholesky factorization, since $\Sigma$ is symmetric positive definite.
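To address the original constraint of never forming the $(n\times n)$ matrix, one standard option (not mentioned above, but a natural fit for $\Sigma = XFX' + D$) is the Woodbury matrix identity: $$\Sigma^{-1} = D^{-1} - D^{-1}X\left(F^{-1} + X'D^{-1}X\right)^{-1}X'D^{-1},$$ which only ever requires a $(k\times k)$ solve and $O(nk)$ products. A minimal NumPy sketch under that assumption (the function name `h_star` is my own):

```python
import numpy as np

def h_star(alpha, X, F, d, kappa):
    """h* = (1/kappa) Sigma^{-1} alpha with Sigma = X F X' + diag(d),
    computed via the Woodbury identity -- Sigma is never formed.

    Cost is O(n k^2 + k^3) instead of O(n^3); only a k x k system is solved.
    """
    Dinv_alpha = alpha / d                  # D^{-1} alpha         (n,)
    Dinv_X = X / d[:, None]                 # D^{-1} X             (n, k)
    core = np.linalg.inv(F) + X.T @ Dinv_X  # F^{-1} + X' D^{-1} X (k, k)
    z = np.linalg.solve(core, X.T @ Dinv_alpha)
    return (Dinv_alpha - Dinv_X @ z) / kappa
```

For moderate $n$ this can be checked against the dense solve `np.linalg.solve(X @ F @ X.T + np.diag(d), alpha) / kappa`; the two agree to machine precision, but only the Woodbury route scales to large $n$.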