Let $A,B\in\mathcal{M}_{m\times n}(\mathbb{F})$ be rectangular matrices ($m\le n$) over an arbitrary field $\mathbb{F}$, such that $A$ has full row rank. Moreover, $\require{enclose} \enclose{horizontalstrike}{AB^T}A^TB$ is a symmetric matrix. Prove that there exists a symmetric solution to the matrix equation $XA=B$, where $X\in\mathcal{M}_m(\mathbb{F})$.
I've been solving a problem in multilinear algebra and it reduced to this problem.
First of all, note that the system indeed has solutions. To see this, observe that $XA=B$ has a solution iff each row system $xA=B_i$ has a solution for a row vector $x$, where $B_i$ is the $i$-th row of $B$; transposing, this is equivalent to $A^Tx^T={B_i}^T$ having a solution. Since $A$ has full row rank, $Ax=b$ is solvable for every $b\in\mathbb{F}^m$, so there exist $x_i\in\mathbb{F}^n$ with $Ax_i=e_i$. Moreover, since $A^TB$ is symmetric, $$A^TB=B^TA\implies A^TBx_i=B^TAx_i=B^Te_i=(B^T)^i,$$ where $(B^T)^i$ denotes the $i$-th column of $B^T$. Since $(B^T)^i=(B_i)^T$, we get $A^T(Bx_i)=(B_i)^T$, which proves the claim: the system has a solution, say $X_0$. Now note that full row rank also implies the existence of at least one right inverse of $A$. Therefore $X_0A=B\implies X_0=X_0AA^+=BA^+$, so $BA^+$ is a solution to the system, where $A^+$ denotes a right inverse of $A$ (which, as is well known, need not be unique).
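The row-by-row solvability argument can be sanity-checked numerically over $\mathbb{R}$. A minimal sketch, assuming NumPy; here $A$ is a generic random matrix (full row rank with probability $1$), and $B$ is manufactured as $SA$ with $S$ symmetric so that $A^TB=A^TSA$ is automatically symmetric:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5

A = rng.standard_normal((m, n))      # generically of full row rank
assert np.linalg.matrix_rank(A) == m

S = rng.standard_normal((m, m))
S = S + S.T                          # symmetric
B = S @ A                            # guarantees A^T B is symmetric
assert np.allclose(A.T @ B, B.T @ A)

# Full row rank => A x = b is solvable for every b; solve A x_i = e_i.
for i in range(m):
    e_i = np.eye(m)[:, i]
    x_i = np.linalg.lstsq(A, e_i, rcond=None)[0]
    assert np.allclose(A @ x_i, e_i)
    # Since A^T B = B^T A, the vector B x_i solves A^T x = (B_i)^T:
    assert np.allclose(A.T @ (B @ x_i), B[i, :])
```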
I was then left to prove that $BA^+$ is symmetric. Despite tackling this claim numerous times, I did not succeed (in fact, I suspected there might be counterexamples).
Another idea is to note that $XA=B\implies A^TX^T=B^T$. Hence, finding a symmetric solution of $XA=B$ is equivalent to finding a simultaneous solution of $XA=B$ and $A^TX=B^T$. So I tried to construct a right inverse $A^+$ of $A$ and a left inverse ${(A^T)}^-$ of $A^T$ satisfying $({A^T})^-B^T=BA^+$, by noting that for any right inverse $A^+$, $$A^++[v\quad v\quad\dotsb\quad v],\qquad \forall v\in \mathbf{ker}(A),$$ is also a right inverse of $A$; here $\mathbf{ker}(A)$ is nontrivial by the rank–nullity theorem unless $m=n$. This approach failed as well.
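The non-uniqueness of right inverses used here is easy to verify numerically over $\mathbb{R}$. A small sketch, assuming NumPy; with $m<n$, the last $n-m$ rows of $V^T$ from the SVD span $\mathbf{ker}(A)$:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5
A = rng.standard_normal((m, n))          # generically of full row rank

# One particular right inverse over the reals: A^+ = A^T (A A^T)^{-1}.
A_plus = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ A_plus, np.eye(m))

# A kernel vector of A from the SVD: rows m..n-1 of Vt span ker(A).
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]
assert np.allclose(A @ v, 0)

# A^+ + [v v ... v] is another right inverse, since each added column
# is killed by A.
another = A_plus + np.tile(v[:, None], (1, m))
assert np.allclose(A @ another, np.eye(m))
assert not np.allclose(another, A_plus)  # genuinely different
```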
By the way, the original problem states: if $V$ is a vector space over an arbitrary field $\mathbb{F}$ and $\alpha_1,\dotsb,\alpha_m,\beta_1,\dotsb,\beta_m\in V^*$ are given such that $\{\alpha_1,\dotsb,\alpha_m\}$ is linearly independent and $\sum\alpha_i\wedge\beta_i=0$, then there exists a symmetric matrix $S=(s_{i,j})\in\mathcal{M}_m(\mathbb{F})$ such that: $$\forall i\in[m]:\quad \beta_i=\displaystyle\sum_{j=1}^m s_{i,j}\alpha_j$$ I was able to transform this into the problem above simply by writing the $\alpha_i$'s and $\beta_i$'s as linear combinations of dual basis elements.
Any hint, new idea or partial solution would be much appreciated.
Thanks to a user's comment, I spotted a typo in my problem statement: the hypothesis is that $A^TB$ is symmetric, not $AB^T$. The problem statement is now edited. Having noted this, the claim is quite straightforward. To prove that $BA^+$ is symmetric, note that since $A^TB$ is symmetric: $$A^TB=B^TA\implies(A^+)^TA^TB=(A^+)^TB^TA$$ Also note that $$AA^+=I\implies(A^+)^TA^T=I^T=I$$ So one can conclude: $$B=(A^+)^TB^TA\implies BA^+=(A^+)^TB^TAA^+=(A^+)^TB^T=(BA^+)^T$$ Which wraps up the proof.
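The finished argument can be sanity-checked numerically over $\mathbb{R}$. A minimal sketch, assuming NumPy; $B$ is manufactured as $SA$ with $S$ symmetric, one convenient way to satisfy the hypothesis, and $A^+=A^T(AA^T)^{-1}$ serves as a right inverse:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5

A = rng.standard_normal((m, n))          # generically of full row rank
S = rng.standard_normal((m, m))
S = S + S.T                              # symmetric
B = S @ A                                # ensures A^T B = A^T S A is symmetric
assert np.allclose(A.T @ B, (A.T @ B).T)

# A right inverse of A over the reals.
A_plus = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ A_plus, np.eye(m))

X0 = B @ A_plus
assert np.allclose(X0 @ A, B)            # X0 solves X A = B
assert np.allclose(X0, X0.T)             # and X0 is symmetric
```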