Using gradient descent and Newton's method combined


I have a function $f(\mathrm{X})$, where $\mathrm{X = A + B + C}$: $\mathrm{A}$ is a diagonal matrix with the variable $a$ on its diagonal, $\mathrm{B}$ is another diagonal matrix with the variable $b$ on its diagonal, and $\mathrm{C}$ is a positive definite matrix with entries $c_{ij}$. I want to optimize $f$ over all of these variables. My idea is to alternate: first hold $\mathrm{A}$ and $\mathrm{B}$ fixed and optimize over $\mathrm{C}$ with Newton's method, then hold $\mathrm{C}$ fixed and optimize over $a$ and $b$ with gradient descent.
I am not sure whether this alternating approach is valid or will converge. Any suggestions?
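What you describe is block-coordinate (alternating) minimization, and it is easy to prototype. Here is a minimal Python sketch on a hypothetical quadratic stand-in for $f$ (the objective, step size, and iteration count are all made up for illustration): an exact Newton step over the $\mathrm{C}$ block, followed by gradient-descent steps on $a$ and $b$.

```python
import numpy as np

# Hypothetical stand-in for the poster's objective, with X = A + B + C:
#   f(a, b, C) = 0.5*||a*I + b*I + C - T||_F^2 + 0.5*lam*||C||_F^2
# It is quadratic, so the Newton step in the C-block is exact.
n = 4
lam = 0.5
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n))
T = 0.5 * (T + T.T)                    # symmetric target matrix
I = np.eye(n)

def f(a, b, C):
    X = a * I + b * I + C
    return 0.5 * np.sum((X - T) ** 2) + 0.5 * lam * np.sum(C ** 2)

a, b = 1.0, -0.5
C = np.zeros((n, n))
f0 = f(a, b, C)

eta = 0.1                              # gradient-descent step size for (a, b)
for _ in range(200):
    # Block 1 (Newton over C, holding a and b fixed): solve
    # grad_C f = (X - T) + lam*C = 0  =>  C = (T - (a+b)*I) / (1 + lam)
    C = (T - (a + b) * I) / (1.0 + lam)
    # Block 2 (gradient descent over a and b, holding C fixed):
    # df/da = df/db = trace(X - T)
    g = np.trace(a * I + b * I + C - T)
    a -= eta * g
    b -= eta * g
```

For a smooth convex $f$, this kind of alternating scheme decreases (or at least does not increase) the objective at every outer iteration; for a nonconvex $f$ it can stall at points that are not stationary for the joint problem, so it is worth monitoring the full gradient as well.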


1 Answer


Here's some sample MATLAB code that uses CVX to solve a problem similar to yours. However, I don't yet know how to handle the $r^T X^{-1} r$ term. (CVX's `matrix_frac(r, X)` may be able to model it, since $r^T X^{-1} r$ is jointly convex in $(r, X)$ for $X \succ 0$.)

N = 30;
Y = randn(N, 2*N);               % with a single vector y, .5*y'*X*y - log_det(X)
S = Y*Y'/(2*N);                  % is unbounded below (grow X orthogonally to y),
                                 % so use a full-rank sample covariance S instead

cvx_begin sdp

    variable X(N,N) symmetric    % dimensions are required; the data is real,
                                 % so symmetric rather than hermitian
    minimize(0.5*trace(S*X) - log_det(X))
    subject to
        X >= 0                   % PSD constraint (redundant here: log_det
                                 % already forces X to be positive definite)

cvx_end
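As a sanity check on the log-det objective above (using a hypothetical well-posed variant with a full-rank sample covariance $S$): the unconstrained minimizer of $\operatorname{tr}(SX) - \log\det X$ over positive definite $X$ is $X^\star = S^{-1}$, since the gradient $S - X^{-1}$ vanishes there. A short NumPy check:

```python
import numpy as np

# Numerical check (assumed variant of the problem above): for a full-rank
# sample covariance S, the minimizer of
#   g(X) = trace(S @ X) - log det X    over symmetric X > 0
# is X* = inv(S), since grad g = S - inv(X) = 0 there.
rng = np.random.default_rng(1)
N = 5
Y = rng.standard_normal((N, 3 * N))
S = Y @ Y.T / (3 * N)                  # full rank with high probability

def g(X):
    sign, logdet = np.linalg.slogdet(X)
    return np.trace(S @ X) - logdet

X_star = np.linalg.inv(S)

# Perturb in random symmetric directions; by convexity of g on the
# positive definite cone, the objective should never decrease.
worse = []
for _ in range(20):
    D = rng.standard_normal((N, N))
    D = 0.5 * (D + D.T)
    worse.append(g(X_star + 1e-3 * D) >= g(X_star))
print(all(worse))                      # → True
```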