I have a question related to optimization.
Given natural numbers $n$ and $\ell$, matrices ${\bf K}_1, \dots, {\bf K}_\ell \in \Bbb R^{n \times n}$ and a vector ${\bf y} \in \Bbb R^n$, define $${\bf K} := s_1 {\bf K}_1 + \dots + s_\ell {\bf K}_\ell + \lambda {\bf I}_n$$ where $s_1, \dots, s_\ell, \lambda \in \Bbb R$, $\lambda > 0$ and ${\bf I}_n$ is the identity matrix. It is also given that ${\bf K}_1, \dots, {\bf K}_\ell$ are all symmetric and positive semidefinite, so (since $\lambda > 0$) ${\bf K}$ is positive definite and hence invertible. The problem is to minimize ${\bf y}^\top {\bf K}^{-1} {\bf y}$ under the constraints $s_1, \dots, s_\ell \ge 0$ and $s_1 + \dots + s_\ell = 1$.
I have found that this problem is related to semidefinite programming (SDP), but since I have no background in optimization theory, I cannot proceed any further. How can I solve this problem?
Let me expand on my earlier comment and @RodrigodeAzevedo's reference.
Your original problem is \begin{align*} \min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}} & \quad \mathbf{y}^T \mathbf{K}^{-1} \mathbf{y}\\ \text{subject to} & \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\ & \quad s_1, \dots, s_\ell \geq 0\\ & \quad s_1 + \dotsb + s_\ell = 1\\ & \quad \lambda > 0\\ & \quad \mathbf{K} \succ 0\\ \end{align*}
If you just want to find a numerical solution to the problem, then CVX or CVXPY can do the job. Use the `matrix_frac` function (reference here) as your objective, write out all of the above constraints, and that should solve the problem. For guidance on using CVX or CVXPY, first check their documentation; they provide plenty of examples to get you started. If you need additional help with the coding, post your question on Stack Overflow.

If you want to understand how
`matrix_frac` works and how it relates to semidefinite programming, then here are the details. First, write your problem in the equivalent epigraph form \begin{align*} \min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}, t} & \quad t\\ \text{subject to} & \quad \mathbf{y}^T \mathbf{K}^{-1} \mathbf{y} \leq t\\ & \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\ & \quad s_1, \dots, s_\ell \geq 0\\ & \quad s_1 + \dotsb + s_\ell = 1\\ & \quad \lambda > 0\\ & \quad \mathbf{K} \succ 0\\ \end{align*}
We want to turn the constraints $\mathbf{y}^T \mathbf{K}^{-1} \mathbf{y} \leq t$, $\mathbf{K} \succ 0$ into a condition that some matrix is positive semidefinite. The key is the Schur complement lemma: if you have a symmetric block matrix $X = \begin{pmatrix} A & B^T\\ B & C \end{pmatrix}$ with $C \succ 0$, then $X \succeq 0$ if and only if the Schur complement $A - B^T C^{-1} B \succeq 0$.
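As a quick numerical sanity check of this lemma (on randomly generated blocks, not data from the problem), one can compare the smallest eigenvalues of $X$ and of its Schur complement:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 2

# Random symmetric block matrix X = [[A, B^T], [B, C]] with C > 0.
A = rng.standard_normal((m, m))
A = (A + A.T) / 2                 # symmetrize the top-left block
B = rng.standard_normal((n, m))
M = rng.standard_normal((n, n))
C = M @ M.T + np.eye(n)           # positive definite by construction

X = np.block([[A, B.T], [B, C]])
schur = A - B.T @ np.linalg.solve(C, B)

# The lemma says these two conditions agree whenever C > 0.
X_psd = np.linalg.eigvalsh(X).min() >= -1e-10
schur_psd = np.linalg.eigvalsh(schur).min() >= -1e-10
print(X_psd, schur_psd)
```

Whatever the random draw, the two booleans printed at the end always agree.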
Using the Schur complement with $A = t$, $B = \mathbf{y}$, $C = \mathbf{K}$, we find that these two constraints are equivalent to $\begin{pmatrix} t & \mathbf{y}^T\\ \mathbf{y} & \mathbf{K} \end{pmatrix} \succeq 0$, $\mathbf{K} \succ 0$.
Now your problem can be expressed as \begin{align*} \min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}, t} & \quad t\\ \text{subject to} & \quad \begin{pmatrix} t & \mathbf{y}^T\\ \mathbf{y} & \mathbf{K} \end{pmatrix} \succeq 0\\ & \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\ & \quad s_1, \dots, s_\ell \geq 0\\ & \quad s_1 + \dotsb + s_\ell = 1\\ & \quad \lambda > 0\\ & \quad \mathbf{K} \succ 0\\ \end{align*} This is a bona fide semidefinite program: You just have a linear objective, linear equality and inequality constraints, and semidefinite matrix constraints.
Note that for implementation purposes, you don't need to manually express your problem in this last form.
`matrix_frac` automatically does this reduction for you behind the scenes.