Matrix Maximization


I would like to solve the following optimization problem for a symmetric, positive-semidefinite matrix $X$: $$\begin{array}{ll} \text{maximize} & f(X) = \log\det X - k_1 \log(k_2 + a^T X a) \\ \text{subject to} & X \preceq W, \end{array}$$ for a fixed matrix $W$ and known constants $k_1, k_2 > 0$.
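For concreteness, here is a small numpy sketch of the objective and the constraint; `a`, `W`, `k1`, `k2` are placeholder data, not from any particular application:

```python
import numpy as np

def objective(X, a, k1, k2):
    """f(X) = log det X - k1 * log(k2 + a^T X a)."""
    sign, logdet = np.linalg.slogdet(X)
    assert sign > 0, "X must be positive definite for log det to be finite"
    return logdet - k1 * np.log(k2 + a @ X @ a)

def is_feasible(X, W, tol=1e-9):
    """Check X <= W in the Loewner order, i.e. W - X is PSD."""
    return np.linalg.eigvalsh(W - X).min() >= -tol

# toy data
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
W = A @ A.T + n * np.eye(n)     # strictly positive definite
a = rng.standard_normal(n)
k1, k2 = 2.0, 1.0

X = 0.5 * W                     # feasible, since 0.5*W <= W when W is PSD
print(is_feasible(X, W), objective(X, a, k1, k2))
```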

I've tried to make it look like a convex optimization problem, but I haven't succeeded. If convex optimization doesn't work here, are there other kinds of optimization algorithms that could potentially help? I'm not an expert on optimization at all, so any kind of guidance or reference is much appreciated. Thanks in advance!

Best Answer

As I said above in my comment, this is not a convex problem; your objective is the difference of two concave functions, i.e. a DC (difference-of-convex) program. However, you might consider a successive convex approximation approach.

Here's what I mean. I'm going to assume that $W$ is strictly positive definite. It must be, actually: if $W$ is singular, then no $X \preceq W$ is positive definite, so the objective is $-\infty$ everywhere. First set $X_0=W$, and for $i=0,1,2,\dots$ do this:

  1. Compute $V_i = \nabla_X k_1 \log (k_2+a^TX_ia) = \frac{k_1}{k_2+a^TX_ia} aa^T$.
  2. Set $X_{i+1}$ to be the solution to the linearized problem: $$\begin{array}{ll} \text{maximize} & \log\det X - \langle V_i, X \rangle \\ \text{subject to} & X \preceq W \end{array}$$
  3. Repeat until $\|X_i-X_{i+1}\|$ is small.

I doubt you can guarantee that this will reach the global optimum, but then again, you typically can't make such guarantees for nonconvex problems anyway. What you do get is monotone ascent: the linearized objective is a minorant of $f$ that is tight at $X_i$, so each iteration cannot decrease $f$. So perhaps this will give you results that are "good enough".
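The loop above is easy to prototype. One observation (my own derivation, not part of the answer, so double-check it before relying on it): because $V_i = c\,aa^T$ is rank one, the linearized subproblem has a closed form. Writing $W = LL^T$ and $X = LYL^T$ turns it into maximizing $\log\det Y - c\,b^T Y b$ over $Y \preceq I$ with $b = L^T a$; the KKT conditions give $Y = I - (1-r)\,uu^T$ with $u = b/\|b\|$ and $r = \min(1,\, 1/(c\, b^T b))$, which maps back to $X_{i+1} = W - (1-r)\,(Wa)(Wa)^T/(a^T W a)$. A numpy sketch on toy data (all names are placeholders):

```python
import numpy as np

def sca_step(X, W, a, k1, k2):
    # One successive-convex-approximation step.  Solves
    #   maximize  log det Z - <V, Z>   s.t.  Z <= W  (Loewner order)
    # with V = c*a*a^T, c = k1/(k2 + a^T X a), using the rank-one
    # closed form sketched above (verify it independently).
    sW = a @ W @ a                    # a^T W a > 0 since W is PD
    c = k1 / (k2 + a @ X @ a)
    r = min(1.0, 1.0 / (c * sW))     # fraction of W kept along W a
    Wa = W @ a
    return W - (1.0 - r) * np.outer(Wa, Wa) / sW

def solve(W, a, k1, k2, tol=1e-10, maxit=200):
    X = W.copy()                      # X_0 = W, assumed strictly PD
    for _ in range(maxit):
        X_next = sca_step(X, W, a, k1, k2)
        if np.linalg.norm(X - X_next) < tol:
            return X_next
        X = X_next
    return X

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
W = A @ A.T + n * np.eye(n)           # strictly positive definite
a = rng.standard_normal(n)
k1, k2 = 3.0, 0.5
X = solve(W, a, k1, k2)               # feasible stationary point
```

Because each surrogate is a minorant of $f$ tight at the current iterate, the objective values along the iterates are nondecreasing, which is a useful sanity check on any implementation.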

If you use an SDP framework like CVX (disclosure: I wrote it), then the SDP subproblem looks like this:

cvx_begin sdp
    variable X(n,n) symmetric
    maximize( log_det(X) - trace(V'*X) )
    subject to
        X <= W   % in sdp mode, <= is the semidefinite (Loewner) order
cvx_end