Suppose I have a homogeneous system of linear equations of the form $AXB=0$ and I want to find a non-trivial solution $X$ (i.e., $X \neq 0$). A possible approach is to constrain $X$ to have unit norm, i.e., $\|X\|_F^2=1$. In this case I can solve $$(B^\top \otimes A)\operatorname{vec}(X)=0$$ by taking the SVD of the matrix $B^\top \otimes A$ and setting $\operatorname{vec}(X)$ equal to, for instance, a right singular vector of $B^\top \otimes A$ associated with a singular value equal to zero.
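To make the approach concrete, here is a minimal numpy sketch of the vectorized SVD method described above; the sizes of $A$ and $B$ and the random data are just assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))    # m×n
B = rng.standard_normal((4, 3))    # q×p, so X is n×q = 4×4

Q = np.kron(B.T, A)                # (pm)×(qn) = 6×16, non-square in general
# Right singular vectors associated with (numerically) zero singular
# values span the null space of Q; the last one works here since
# rank(Q) <= 6 < 16.
_, s, Vt = np.linalg.svd(Q)
x = Vt[-1]                         # unit-norm vector with Q x ≈ 0
X = x.reshape(4, 4, order="F")     # vec() stacks columns, hence order="F"

print(np.linalg.norm(A @ X @ B))   # ≈ 0, and ||X||_F = 1 by construction
```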
But how can I enforce other kinds of constraints, for instance that $\operatorname{det}(X) \neq 0$? Is there a closed-form solution like the SVD above?
Notice that the matrix $Q=B^\top \otimes A$ is in general non-square.
Note that if a solution to $AXB = 0$ exists with $X$ square of size $n$ and $\det(X)\neq 0$, then by Sylvester's rank inequality we have $$ \begin{align} 0 &= \operatorname{rank}(AXB) \\ & \geq \operatorname{rank}(A) + \operatorname{rank}(XB) - n = \operatorname{rank}(A) + \operatorname{rank}(B) - n, \end{align} $$ where the last equality uses the invertibility of $X$. Thus $\operatorname{rank}(A) + \operatorname{rank}(B) \leq n$, or equivalently, $\operatorname{rank}(B) \leq \dim \ker(A)$. If $A$ and $B$ satisfy this condition, then a matrix $X$ will satisfy $AXB = 0$ iff it satisfies the following condition, which I will call Condition 1: $X$ maps $\operatorname{im}(B)$ into $\ker(A)$.
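The rank condition is easy to check numerically; a small sketch (the shapes and random data are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((2, n))    # rank(A) = 2 with probability 1
B = rng.standard_normal((n, 3))    # rank(B) = 3 with probability 1

rank_A = np.linalg.matrix_rank(A)
rank_B = np.linalg.matrix_rank(B)
# Invertible solutions X can exist only if rank(A) + rank(B) <= n,
# i.e. rank(B) <= dim ker(A).
print(rank_A + rank_B <= n)        # True: 2 + 3 <= 5
```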
One solution $X$ can be generated as follows. Let $u_1,\dots,u_k$ be an orthonormal basis for the kernel of $A$, which can be obtained from the right singular vectors of $A$. Extend this to a full orthonormal basis $u_1,\dots,u_n$ of $\Bbb R^n$. Let $v_1,\dots,v_r$ (with $r\leq k$) be an orthonormal basis for the image of $B$, and extend it to an orthonormal basis $v_1,\dots,v_n$ of $\Bbb R^n$. Let $U$ denote the matrix with columns $u_1,\dots,u_n$ and let $V$ denote the matrix with columns $v_1,\dots,v_n$.
The matrix $X = UV^T$ satisfies $Xv_j = u_j$, so it maps $\operatorname{im}(B)$ into $\ker(A)$ and hence satisfies $AXB = 0$; moreover, since $U$ and $V$ are orthogonal, we have $|\det(X)| = 1 \neq 0$.
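This construction takes two SVDs; a numpy sketch, with made-up example data for $A$ and $B$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((2, n))             # dim ker(A) = k = 3
B = rng.standard_normal((n, 3))             # rank(B) = r = 3 <= k
rA = np.linalg.matrix_rank(A)

# Orthonormal basis of R^n with ker(A) first: right singular vectors of A,
# reordered so the null-space vectors come first.
_, _, VtA = np.linalg.svd(A)
U = np.hstack([VtA[rA:].T, VtA[:rA].T])     # columns u_1,...,u_n

# Orthonormal basis of R^n with im(B) first: left singular vectors of B.
UB, _, _ = np.linalg.svd(B)
V = UB                                      # columns v_1,...,v_n

X = U @ V.T                                 # sends v_j to u_j, so im(B) -> ker(A)
print(np.linalg.norm(A @ X @ B))            # ≈ 0
print(abs(np.linalg.det(X)))                # ≈ 1, so X is invertible
```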
We can say a bit more than this. Write $U= [U_1 \ \ U_2]$ and $V = [V_1\ \ V_2]$, where $U_1$ and $V_1$ each have $k$ columns. Suppose that $X = UYV^T$, so that $$ \begin{align} X &= \pmatrix{U_1 & U_2}\pmatrix{Y_{11} & Y_{12}\\Y_{21} & Y_{22}}\pmatrix{V_1 & V_2}^T \\ &= U_1Y_{11}V_1^T + U_1Y_{12}V_2^T + U_2Y_{21}V_1^T + U_2 Y_{22}V_2^T. \end{align} $$ For any $v \in \operatorname{im}(B)$ we have $V_2^Tv = 0$, which means that $$ Xv = (U_{1}Y_{11})(V_1^Tv) + (U_{2}Y_{21})(V_1^Tv). $$ Whenever $U_2Y_{21} \neq 0$, it holds that $(U_{2}Y_{21})(V_1^Tv) \neq 0$ for some $v \in \operatorname{im}(B)$. For this $v$, it holds that $Xv \notin \ker(A)$, which is to say that $X$ does not satisfy Condition 1. Conversely, if $U_2Y_{21} = 0$, it is easy to see that $X$ will satisfy Condition 1. Thus, $X$ will be a solution iff $U_2Y_{21} = 0$, which holds iff $Y_{21} = 0$, since $U_2$ has full column rank. (Strictly speaking, when $r < k$ only the first $r$ columns of $Y_{21}$ are constrained in this way; the characterization is exact when $r = k$, i.e., $\operatorname{rank}(B) = \dim \ker(A)$.)
Thus, $X$ is a solution to the equation $AXB = 0$ iff it is of the form $$ X = \pmatrix{U_1 & U_2}\pmatrix{Y_{11} & Y_{12}\\0 & Y_{22}}\pmatrix{V_1 & V_2}^T. $$ This matrix will be invertible iff both $Y_{11}$ and $Y_{22}$ are invertible.
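The full family of solutions can be sampled numerically from this parametrization; a sketch, again with made-up $A$, $B$, and random blocks $Y_{ij}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 3
A = rng.standard_normal((2, n))             # dim ker(A) = k = 3
B = rng.standard_normal((n, 3))             # rank(B) = 3

_, _, VtA = np.linalg.svd(A)
U = np.hstack([VtA[2:].T, VtA[:2].T])       # U = [U1 U2], U1: k columns spanning ker(A)
UB, _, _ = np.linalg.svd(B)
V = UB                                      # V = [V1 V2], V1: k columns containing im(B)

# Any block upper-triangular Y yields a solution X = U Y V^T; X is
# invertible iff Y11 and Y22 are (generic random blocks are invertible).
Y11 = rng.standard_normal((k, k))
Y12 = rng.standard_normal((k, n - k))
Y22 = rng.standard_normal((n - k, n - k))
Y = np.block([[Y11, Y12], [np.zeros((n - k, k)), Y22]])
X = U @ Y @ V.T

print(np.linalg.norm(A @ X @ B))            # ≈ 0
print(abs(np.linalg.det(X)) > 1e-8)         # True: X is invertible
```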