Consider a semidefinite program with equality constraints in which the variable $X$ is a block matrix. Denoting the blocks of $X$ by $X_j$, where $j$ is an index, the standard form of the equality constraints is $$\sum_j {\rm tr}(A_{ij} X_j)=b_i,$$ where the $b_i$ are scalars and the $A_{ij}$ are matrices. In addition, each $X_j$ is constrained to be positive semidefinite (psd), and one optimizes some linear function of the $X_j$.
Suppose we add a linear constraint of the form $$\sum_j D_j X_j=0,$$ where the $D_j$ are matrices, $D_j X_j$ denotes the matrix product, and the equality holds as a matrix equality. Such a constraint can be expressed in the standard form in an obvious way: choose a basis $\{E_k\}$ for the vector space of matrices of the given size, and require that the trace of $\sum_j D_j X_j$ against each basis element vanishes, i.e., ${\rm tr}\bigl(E_k^T \sum_j D_j X_j\bigr)=0$ for every $k$, which is the standard form with $A_{kj}=E_k^T D_j$ and $b_k=0$. However, expressing the constraint this way can entail a significant memory overhead. For example, if all the $X_j$ and $D_j$ are $n$-by-$n$ matrices, the expansion produces $n^2$ scalar constraints, each with its own $n$-by-$n$ data matrices $A_{kj}$, so it requires $O(n^4)$ memory, while the $D_j$ themselves require only $O(n^2)$ memory to store.
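To make the reformulation concrete, here is a small numerical sketch (my own illustration, not part of the question): taking the standard basis $E_{ab}$ (a $1$ in entry $(a,b)$, zeros elsewhere), one has ${\rm tr}(E_{ab}^T\, D_j X_j) = (D_j X_j)_{ab}$, so the $n^2$ trace constraints recover the matrix equality entrywise.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3  # block size and number of blocks (arbitrary choices)
D = [rng.standard_normal((n, n)) for _ in range(m)]
X = [rng.standard_normal((n, n)) for _ in range(m)]

# The matrix-valued left-hand side sum_j D_j X_j.
M = sum(Dj @ Xj for Dj, Xj in zip(D, X))

# Standard-form left-hand sides: one scalar per basis matrix E_k,
# using A_{kj} = E_k^T D_j as in the expansion above.
lhs = np.empty(n * n)
for k in range(n * n):
    Ek = np.zeros((n, n))
    Ek[k // n, k % n] = 1.0
    lhs[k] = sum(np.trace(Ek.T @ Dj @ Xj) for Dj, Xj in zip(D, X))

# The n^2 traces are exactly the entries of M in row-major order,
# so tr-form constraints = 0 for all k  <=>  sum_j D_j X_j = 0.
assert np.allclose(lhs, M.ravel())
print("trace form matches the matrix equality entrywise")
```

Note that storing all $n^2$ matrices $A_{kj}$ explicitly is what incurs the $O(n^4)$ cost; the check above never needs more than the $D_j$ themselves.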
So, my questions: (1) is there a standard name for the kind of linear constraint that I mention? (2) do any currently available solvers support such a constraint?