Consider the following problem: given $A,B\in\mathbb{R}^{N\times P}$ for $N>P$, we wish to find $X\in\mathbb{R}^{N\times N}$ such that:
$$XA=B$$
Without further constraints, the minimum-Frobenius-norm solution is given by the Moore-Penrose pseudoinverse (assuming $A$ has full column rank): $$X=BA^+=B(A'A)^{-1}A'$$
but I'm interested in a solution $X$ with zero diagonal, i.e., $X_{ii}=0\ \forall i\in[N]$. This is still a linear system in the entries of $X$, so it can be solved numerically, but is there a way to write the solution $X$ "in closed form"?
EDITED: fixed the wrong ("transposed") definition mentioned in some of the answers below
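As a sanity check on the unconstrained case, here is a minimal NumPy sketch (random $A,B$ of my own choosing; it assumes $A$ has full column rank so that $A'A$ is invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 6, 3
A = rng.standard_normal((N, P))  # assumed full column rank
B = rng.standard_normal((N, P))

# Moore-Penrose pseudoinverse A^+ = (A'A)^{-1} A', shape (P, N)
A_pinv = np.linalg.solve(A.T @ A, A.T)
X = B @ A_pinv  # unconstrained minimum-norm solution

print(np.allclose(X @ A, B))
```

Since $A^+A=I_P$ under the full-rank assumption, $XA=BA^+A=B$ holds up to rounding; the diagonal of this $X$ is of course generically nonzero, which is the point of the question.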
OK, there's a nice answer, which stems from a Lagrangian derivation of the Moore-Penrose pseudoinverse solution in the unconstrained case.
The unconstrained case amounts to minimizing $\frac{1}{2}\|X\|^2_F$ subject to $XA=B$, with Lagrangian $${\cal L}=\frac{1}{2}\|X\|^2_F-\langle Y,XA-B\rangle$$ where $Y\in\mathbb{R}^{N\times P}$ holds the Lagrange multipliers and $\langle\cdot,\cdot\rangle$ denotes the element-wise (Frobenius) inner product. Setting $\partial {\cal L}/\partial X=X-YA'=0$ gives $X=YA'$, and then $XA=B$ yields $Y=B(A'A)^{-1}$ and $X=BA^+$.
Moving to the zero-diagonal case, the optimization problem becomes minimizing $\frac{1}{2}\|X\|^2_F$ subject to $XA=B$ and $diag(X)=0$, with Lagrangian $${\cal L}=\frac{1}{2}\|X\|^2_F-\langle Y,XA-B\rangle-y\cdot diag(X)$$ where $Y\in\mathbb{R}^{N\times P}$ and $y\in\mathbb{R}^{N}$ are Lagrange multipliers. Setting $\partial {\cal L}/\partial X=X-YA'-\Delta(y)=0$ gives $X=YA'+\Delta(y)$, where $\Delta(y)$ is the diagonal matrix with $y$ on its diagonal. From $XA=B$ we get $Y=(B-\Delta(y)A)(A'A)^{-1}$, and substituting back: $$X=BA^++\Delta(y)(I-AA^+)$$
Finally, imposing $diag(X)=0$, i.e. $diag(BA^+)+y\circ diag(I-AA^+)=0$, gives the self-consistent solution $$y=-diag(BA^+)/diag(I-AA^+)$$ where the division is element-wise (this requires every diagonal entry of $AA^+$ to differ from $1$).
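The closed form can be verified numerically; a minimal NumPy sketch (random $A,B$, assuming full column rank of $A$ and that no diagonal entry of $AA^+$ equals $1$, which holds generically for $N>P$):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 6, 3
A = rng.standard_normal((N, P))  # assumed full column rank
B = rng.standard_normal((N, P))

A_pinv = np.linalg.solve(A.T @ A, A.T)  # A^+ = (A'A)^{-1} A'
M = np.eye(N) - A @ A_pinv              # I - A A^+ (residual projector)
X0 = B @ A_pinv                         # unconstrained solution B A^+

# y = -diag(B A^+) / diag(I - A A^+), element-wise;
# requires all diagonal entries of M to be nonzero
y = -np.diag(X0) / np.diag(M)
X = X0 + np.diag(y) @ M                 # X = B A^+ + Δ(y)(I - A A^+)

print(np.allclose(X @ A, B), np.allclose(np.diag(X), 0))
```

Both checks pass because $(I-AA^+)A=0$, so the correction term does not disturb $XA=B$, while $y$ was chosen exactly to cancel the diagonal.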