Does anyone know any efficient method to solve the following problem?
$ (\alpha,\beta) = \text{argmax}_{\alpha,\beta} \log \det (\alpha K_1 + \beta K_2)$
s.t. $c_1 \alpha + c_2 \beta = c_3, \alpha\geq0, \beta\geq 0$
where $K_1$ and $K_2$ are known positive semi-definite matrices and $c_1$, $c_2$ and $c_3$ are known constants.
Many thanks!
Here is a simple starting point, assuming $K_2$ is (strictly) positive definite. Since $$\det(\alpha K_1 + \beta K_2) = \det(K_2)\,\det\bigl(\alpha K_2^{-1/2} K_1 K_2^{-1/2} + \beta I\bigr),$$ the problem reduces to the case where one of the matrices is the identity.
Now, let $K_2^{-1/2} K_1 K_2^{-1/2} = U \Lambda U^T$ be a spectral decomposition; this matrix is symmetric positive semi-definite, so $\Lambda = \text{diag}(\lambda_1,\dots,\lambda_n)$ with $\lambda_i \geq 0$. Then, the problem is $$ \max \sum_{i=1}^n \log(\alpha \lambda_i + \beta) $$ subject to the constraints. Assuming that one of $c_1$ or $c_2$ is $>0$, say $c_1 > 0$, you can eliminate $\alpha = (c_3 - c_2 \beta)/c_1$ and turn the problem into a one-dimensional concave maximization over $\beta$, restricted to the interval where both $\alpha$ and $\beta$ are nonnegative.
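For concreteness, here is a minimal numerical sketch of this reduction in Python. The function name `solve_logdet` and the use of `scipy.optimize.minimize_scalar` are my own choices for illustration; it assumes $K_2$ strictly positive definite and $c_1, c_2, c_3 > 0$ so that the feasible $\beta$ range is $[0, c_3/c_2]$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def solve_logdet(K1, K2, c1, c2, c3):
    """Maximize log det(a*K1 + b*K2) s.t. c1*a + c2*b = c3, a >= 0, b >= 0.

    Assumes K2 is strictly positive definite and c1, c2, c3 > 0.
    """
    # Symmetric inverse square root of K2 via its eigendecomposition.
    w, V = np.linalg.eigh(K2)
    K2_inv_half = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    # M = K2^{-1/2} K1 K2^{-1/2} is symmetric PSD; its eigenvalues are the
    # lambda_i in the reduced objective sum_i log(a*lambda_i + b).
    M = K2_inv_half @ K1 @ K2_inv_half
    lam = np.linalg.eigvalsh(M)

    def alpha(b):
        # Eliminate a via the equality constraint c1*a + c2*b = c3.
        return (c3 - c2 * b) / c1

    def neg_obj(b):
        # -sum_i log(a*lambda_i + b); the constant log det(K2) is dropped.
        vals = alpha(b) * lam + b
        if np.any(vals <= 0):
            return np.inf
        return -np.sum(np.log(vals))

    # a >= 0 and b >= 0 restrict b to [0, c3/c2].
    res = minimize_scalar(neg_obj, bounds=(0.0, c3 / c2), method="bounded")
    b = res.x
    return alpha(b), b
```

The objective is concave in $\beta$ along the constraint line, so the bounded scalar search converges to the global maximizer.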
If neither $K_1$ nor $K_2$ is p.d., you might try adding a small perturbation, say $K_2' = K_2 + \epsilon I$, which is strictly positive definite for any $\epsilon > 0$.
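As a quick sanity check of the perturbation idea (the rank-deficient matrix below is just an illustrative example; note the perturbed optimizer need not exactly match the unperturbed one for finite $\epsilon$):

```python
import numpy as np

# A rank-one PSD matrix: singular, so K2^{-1/2} does not exist.
K2 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
eps = 1e-6
K2_reg = K2 + eps * np.eye(2)  # perturbed matrix, now strictly PD

print(np.linalg.eigvalsh(K2))      # smallest eigenvalue is 0
print(np.linalg.eigvalsh(K2_reg))  # all eigenvalues strictly positive
```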