How to decompose this time-varying optimization problem (semidefinite program) in order to solve it?


Let us consider the following semidefinite program \begin{align*} &\min_{X_{t}, \Omega_t, \Pi} \quad \frac{1}{T+1} \sum_{t=0}^{T} \text{Tr} (X_t) \quad \text{s.t.}\\ &\begin{bmatrix} X_{t} & L_t \\ L_t^{\text{T}} & \Omega_{t} \end{bmatrix} \succeq 0, \;\; 0 \leq t\leq T, \\ & \begin{bmatrix} C_{t+1}^{\text{T}} \Pi C_{t+1} - \Omega_{t+1} + \Xi_{t} & \Xi_{t} A_{t} \\ A_{t}^{\text{T}} \Xi_{t} & \Omega_{t} + A_{t}^{\text{T}} \Xi_{t} A_{t} \end{bmatrix} \succeq 0, \;\; 0 \leq t \leq T-1, \\ &\begin{bmatrix} I_{p_{i}}/\alpha_{i}^{2} + V_i^{-1} & E_{i}^\text{T}\\ E_{i} & V-V \Pi V \end{bmatrix}\succeq 0,\;\; 1 \leq i\leq n. \end{align*} The second constraint couples variables at consecutive time steps, $t$ and $t+1$ ($t$ denotes time), while the third constraint involves only the static variable $\Pi$. Does anyone know how to decompose this semidefinite program so that it can be solved efficiently (e.g., via interior-point methods, Lagrangian relaxation, primal-dual methods, or dynamic programming)? Thanks.