Suppose we are given $A\in\mathbb{R}^{mn\times mn}$, with $A$ symmetric positive definite.
We wish to find $B\in\mathbb{R}^{m\times m}$, $C\in\mathbb{R}^{n\times n}$ and $D\in\mathbb{R}^{mn\times mn}$, with $B$, $C$ and $D$ all symmetric positive semi-definite, such that $A=B\otimes C+D$ and such that $\| D \|_F$ is as small as possible.
Without the constraint that $D$ is positive semi-definite (henceforth p.s.d.), this would be a standard "Nearest Kronecker Product" problem, whose solution is discussed by Van Loan and Pitsianis (1993), https://doi.org/10.1007/978-94-015-8196-7_17. (See also this question: "Method to reverse a Kronecker product".)
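For reference, here is a minimal sketch of the unconstrained Van Loan–Pitsianis solution: rearrange $A$ into an $m^2\times n^2$ matrix whose rows are the vectorized $n\times n$ blocks of $A$, then take its best rank-one approximation via the SVD. The function name and the even split of the singular value between $B$ and $C$ are my own choices, not from the paper.

```python
import numpy as np

def nearest_kronecker(A, m, n):
    """Unconstrained nearest Kronecker product (Van Loan-Pitsianis):
    minimise ||A - np.kron(B, C)||_F over B (m x m) and C (n x n)."""
    # Rearrange A: row (i1*m + i2) of R is the flattened (i1, i2) block of A.
    R = A.reshape(m, n, m, n).transpose(0, 2, 1, 3).reshape(m * m, n * n)
    # Best rank-one approximation of R gives vec(B) and vec(C).
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    B = np.sqrt(s[0]) * U[:, 0].reshape(m, m)
    C = np.sqrt(s[0]) * Vt[0].reshape(n, n)
    # The SVD fixes B and C only up to a joint sign flip; normalise so
    # trace(B) >= 0 (natural when A is s.p.d.).
    if np.trace(B) < 0:
        B, C = -B, -C
    return B, C
```

Note that even for s.p.d. $A$ nothing here forces the resulting $B$ and $C$ to be p.s.d. individually, let alone the residual $D=A-B\otimes C$.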
However, with the constraint that $D$ is p.s.d., it appears that this problem is much harder. Is there an alternative to numerical optimisation to solve it?
I had two ideas for avenues of attack, though neither has amounted to much.
Using the fact that two symmetric positive definite matrices may be simultaneously diagonalized, applied to $A$ and $\hat{B}\otimes\hat{C}$, where $\hat{B}$ and $\hat{C}$ are the solutions to the problem without the p.s.d. constraint on $D$. The problem is that the matrix which simultaneously diagonalizes these will not have the required Kronecker structure.
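For concreteness, the simultaneous diagonalizer can be computed via a Cholesky factor of one matrix and an eigendecomposition of the congruence-transformed other; this is the standard construction, though the function name below is mine. One can verify numerically that the resulting $W$ generally has no Kronecker structure.

```python
import numpy as np

def simultaneous_diagonalizer(A, K):
    """For symmetric A and symmetric positive definite K, return W and lam
    with W.T @ K @ W = I and W.T @ A @ W = diag(lam)."""
    L = np.linalg.cholesky(K)                          # K = L L^T
    M = np.linalg.solve(L, np.linalg.solve(L, A).T).T  # M = L^{-1} A L^{-T}
    lam, Q = np.linalg.eigh(M)                         # M = Q diag(lam) Q^T
    W = np.linalg.solve(L.T, Q)                        # W = L^{-T} Q
    return W, lam
```

Here one would take $K=\hat{B}\otimes\hat{C}$; the obstruction mentioned above is that $W$ need not factor as a Kronecker product even when $K$ does.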
Using the connection to the generalized eigenvalue problem: if the constraint that $D$ is p.s.d. binds, then there must exist some $v\in\mathbb{R}^{mn}$ and some $\lambda\in\mathbb{R}$ s.t. $(A-\lambda (B\otimes C))v=0$. This has helped simplify numerical optimisation, as it means we can directly impose the constraint, but I haven't been able to get further analytical results out of it.
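One concrete use of this connection: when $B\otimes C$ is positive definite, $D=A-B\otimes C\succeq 0$ exactly when every generalized eigenvalue $\lambda$ of the pencil $(A,\,B\otimes C)$ satisfies $\lambda\ge 1$, so whether the constraint binds can be tested with a single generalized eigensolve. A sketch, assuming SciPy (the function name and tolerance are my own):

```python
import numpy as np
from scipy.linalg import eigh

def constraint_binds(A, B, C, tol=1e-10):
    """Test whether D = A - kron(B, C) fails to be p.s.d.
    Assumes kron(B, C) is symmetric positive definite, in which case
    D >= 0 iff all generalized eigenvalues of (A, kron(B, C)) are >= 1."""
    K = np.kron(B, C)
    lam = eigh(A, K, eigvals_only=True)  # generalized eigenvalues of (A, K)
    return lam.min() < 1 - tol
```

The eigenvector attaining the minimal generalized eigenvalue is a candidate for the $v$ above when the constraint is active.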