Indefinite Matrix Constraint in a Semidefinite Program


I'm trying to formulate a feasibility problem ($\min 0$) with the following variables: $X_1, X_2 \in \mathbb{S}^k$ and $X_{1,i}, X_{2,i} \in \mathbb{S}^n$ for $i = 1, \dots, K$, where $X_1$ and $X_2$ are diagonal and indefinite, while each $X_{1,i}$ and $X_{2,i}$ is PSD. Let $R \in \mathbb{S}^k$ be a diagonal matrix with known real entries.

The problem itself is:

$$\begin{aligned}
\min \;& 0 \\
\text{such that } \; & X_1 - X_2 = R, \\
& Y_{1,k} - \sum_{j=1}^K a_j X_{1,j,k} = 0, \qquad k = 1, \dots, K_1, \\
& Y_{2,k} - \sum_{j=1}^K b_j X_{2,j,k} = 0, \qquad k = 1, \dots, K_2,
\end{aligned}$$
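For what it's worth, since $X_1$, $X_2$, and $R$ are all diagonal, the first constraint can be written entrywise: with $X_1 = \operatorname{diag}(x_1)$, $X_2 = \operatorname{diag}(x_2)$, and $R = \operatorname{diag}(r)$, it is just $k$ scalar linear equalities,

$$x_{1,j} - x_{2,j} = r_j, \qquad j = 1, \dots, k,$$

so the indefiniteness of $X_1$ and $X_2$ simply means the scalars $x_{1,j}, x_{2,j}$ are free (sign-unconstrained) variables, which any SDP solver handles directly.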

where each $Y_{1,j}$ and $Y_{2,j}$ is in $\mathbb{S}^n$ and contains elements of $X_1$ and $X_2$, respectively. Is there any way to (a) solve this problem; (b) recast it as an LMI; and (c) solve it without introducing slack variables that would increase the computational cost?

Thanks in advance!