Suppose $Y = P\Sigma P^T$ where $\Sigma$ is a positive definite matrix and $P$ is a rectangular matrix with full row rank.
I want to solve the following matrix equation for $P$: \begin{align*} P\Sigma U - PVP^T Y^{-1}P\Sigma U + PVU = 0 \end{align*}
where $V$ is a positive definite matrix and $U$ is a rectangular matrix of full column rank.
One possibility that I can see from here is $P\Sigma U = PVU = 0$: since the middle term contains $P\Sigma U$ as a factor, it vanishes as well.
Are there any other possibilities? What is the best way to solve this equation?
First, rewrite: $$P(\Sigma - VP^T Y^{-1} P \Sigma+V)U=0$$
Now, $P$ is a "wide" matrix (full row rank) and $U$ is a "tall" one (same shape as $P^T$). $P$ only has a right inverse, so we cannot simply eliminate it from the equation. But we can see that the row space will be important: the columns of $(\Sigma - VP^T Y^{-1} P \Sigma+V)U$ must be orthogonal to the rows of $P$ for the equation to hold.
When dealing with rectangular matrices, where the row spaces and column spaces are nontrivial, it is best to work with the SVD.
Split
$$P=ESF$$ where $E$ is an orthogonal square matrix, $S$ is a diagonal positive definite matrix (all singular values are positive because $P$ has full row rank), and $F$ is a "wide" matrix with orthonormal rows spanning the row space of $P$. With that we can compute $$Y^{-1}=ES^{-1}(F\Sigma F^T)^{-1}S^{-1}E^T$$ where we have to be careful not to "invert" the rectangular matrix $F$.
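As a sanity check, the factorization and the inverse formula can be verified numerically; the dimensions and random matrices below are just an illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 2, 4                        # P is m x n "wide"; sizes chosen arbitrarily

A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)    # random symmetric positive definite Sigma
P = rng.standard_normal((m, n))    # a generic P has full row rank

# Thin SVD: E (m x m) orthogonal, s > 0, F (m x n) with orthonormal rows
E, s, F = np.linalg.svd(P, full_matrices=False)

Y = P @ Sigma @ P.T
# E S^{-1} (F Sigma F^T)^{-1} S^{-1} E^T
Yinv = E @ np.diag(1 / s) @ np.linalg.inv(F @ Sigma @ F.T) @ np.diag(1 / s) @ E.T
assert np.allclose(Yinv, np.linalg.inv(Y))
```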
Then, observe the expression in your equation: $$P^T Y^{-1}P=F^T(F\Sigma F^T)^{-1}F$$
With that, we see that the equation is independent of $E$ and $S$: if a semi-orthogonal $F$ solves the equation, then so does any $P=ESF$. The only part that matters is the row space of $P$, i.e. $F$. At this point we can also see that only the column space of $U$ matters, so we may as well assume $U$ has orthonormal columns.
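Both the identity $P^T Y^{-1} P = F^T(F\Sigma F^T)^{-1}F$ and the resulting independence from $E$ and $S$ are easy to check numerically (the sizes and random matrices are again just an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 4
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)          # random SPD Sigma
P = rng.standard_normal((m, n))
E, s, F = np.linalg.svd(P, full_matrices=False)

Y = P @ Sigma @ P.T
lhs = P.T @ np.linalg.inv(Y) @ P
rhs = F.T @ np.linalg.inv(F @ Sigma @ F.T) @ F
assert np.allclose(lhs, rhs)             # P^T Y^{-1} P depends only on F

# Swap in an arbitrary orthogonal E2 and positive diagonal S2: P2 = E2 S2 F
# has the same row space, and P2^T Y2^{-1} P2 is unchanged.
E2, _ = np.linalg.qr(rng.standard_normal((m, m)))
S2 = np.diag(rng.uniform(0.5, 2.0, m))
P2 = E2 @ S2 @ F
Y2 = P2 @ Sigma @ P2.T
assert np.allclose(P2.T @ np.linalg.inv(Y2) @ P2, rhs)
```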
$$\operatorname{rowspace}(F)\subseteq \operatorname{cokernel}\big((\Sigma - VF^T (F\Sigma F^T)^{-1} F \Sigma+V)U\big)$$
If there were no middle term, that would be the end: compute the cokernel on the right; if its dimension is greater than or equal to the rank of $P$, solutions exist, otherwise they do not: $$\operatorname{rowspace}(F)\subseteq \operatorname{cokernel}\big((\Sigma + V)U\big)$$
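In that middle-term-free case, a basis for the cokernel (left null space) can be read off the SVD. A minimal sketch, with a random stand-in matrix `M` for $(\Sigma + V)U$ and arbitrary sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, m = 5, 2, 2                  # M is n x k with k < n; m = rank of P
M = rng.standard_normal((n, k))    # stand-in for (Sigma + V) U

# Cokernel of M = null space of M^T, spanned by the trailing right-singular
# vectors of M^T.
_, sv, Vh = np.linalg.svd(M.T)
rank = int(np.sum(sv > 1e-12))
coker = Vh[rank:]                  # rows form an orthonormal basis of the cokernel
assert np.allclose(coker @ M, 0)

# Solutions exist iff the cokernel dimension is at least the rank of P.
solutions_exist = (n - rank) >= m
```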
As the middle term depends on $F$, we have to do more. The factor $VF^T\ldots$ feeds the elements of the row space of $F$ into the matrix $V$, so the requirement is now that $F$ must be chosen so that $V$ applied to the row space of $F$ lands in its complement. This is a further restriction on the choice of $F$.
If the row space of $F$ is an invariant subspace of $\Sigma$, then the middle term reduces to $VF^T F$ (it acts as $V$ on vectors from the row space of $F$, and as zero otherwise). In that case, we can rewrite $$\operatorname{rowspace}(F)\subseteq \operatorname{cokernel}\big((\Sigma +V(I-F^TF))U\big)$$ and we have a special case indicating that we must choose an $F$ whose row space is also an invariant subspace of $V$ (no vectors from the complement should map into the row space of $F$).
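The reduction in this special case can be verified numerically, e.g. with a diagonal $\Sigma$ (so that coordinate subspaces are $\Sigma$-invariant) and $F$ made of coordinate rows; the concrete sizes are an assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 2
Sigma = np.diag(rng.uniform(1.0, 3.0, n))  # diagonal SPD: coordinate subspaces invariant
F = np.eye(n)[:m]                          # rows = first m coordinate vectors
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)                # random SPD V

# Middle term V F^T (F Sigma F^T)^{-1} F Sigma collapses to V F^T F ...
middle = V @ F.T @ np.linalg.inv(F @ Sigma @ F.T) @ F @ Sigma
assert np.allclose(middle, V @ F.T @ F)

# ... so the bracket becomes Sigma + V (I - F^T F).
assert np.allclose(Sigma - middle + V, Sigma + V @ (np.eye(n) - F.T @ F))
```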
This is by no means a closed-form solution, but it shows a few properties.