Let $f:\mathbb{R}^{n\times m}\rightarrow \mathbb{R}$ be a function that takes an $n\times m$ matrix $X$ and maps it to the real line. Suppose that the derivative of $f$ with respect to one element $X_{ij}$ is $$ \frac{df}{dX_{ij}}=a^\top X E 1_j b_i $$ where $a$ is an $n\times 1$ vector, $E$ is an $m\times m$ positive definite matrix, $b$ is an $n\times 1$ vector, and $1_j$ is the $j$th standard basis vector of $\mathbb{R}^m$.
My question is: What is $f$? Does such an $f$ even exist?
If $a=b$, I believe that the answer is
$$f=\frac{1}{2}a^\top X E X^\top a+\text{const.},$$
but I can't seem to generalize this result. Similarly, if $m=n=1$ it is trivial to find $f$.
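The $a=b$ guess can at least be sanity-checked numerically. Here is a small NumPy sketch (the dimensions and names are arbitrary) comparing the gradient implied by $f=\frac{1}{2}a^\top XEX^\top a$, namely $\frac{df}{dX_{ij}}=a_i\,(a^\top XE)_j$, against central finite differences, assuming a symmetric positive definite $E$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
a = rng.standard_normal((n, 1))
X = rng.standard_normal((n, m))
M = rng.standard_normal((m, m))
E = M @ M.T + m * np.eye(m)        # symmetric positive definite

def f(X):
    # candidate antiderivative for the a = b case
    return 0.5 * (a.T @ X @ E @ X.T @ a).item()

# claimed gradient: df/dX_ij = a_i * (a^T X E)_j, as an n x m matrix
grad = a @ (a.T @ X @ E)

# central finite differences, entry by entry
h = 1e-6
num = np.zeros_like(X)
for i in range(n):
    for j in range(m):
        Xp = X.copy(); Xp[i, j] += h
        Xm = X.copy(); Xm[i, j] -= h
        num[i, j] = (f(Xp) - f(Xm)) / (2 * h)

print(np.max(np.abs(num - grad)))  # prints a tiny number (finite-difference error)
```

With a symmetric $E$ and $b=a$ the two gradients agree to numerical precision, which is consistent with the guess above.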
$\def\pdv#1#2{\frac{\partial #1}{\partial #2}}$
$1_j$ is the $j$th standard basis vector of $\mathbb{R}^m$, $e_j$, so $Ee_j$ equals the $j$th column of $E$, and the expression you have written is $$\pdv{ f}{ X_{ij}}=\sum_{k=1}^n \sum_{l=1}^m a_k X_{kl} E_{lj} b_i.$$ Let's treat $f$ as a multivariate function whose variables are the elements of the matrix $X$. The r.h.s. contains the term $a_i X_{ij} E_{jj} b_i$ (the $k=i$, $l=j$ summand); we can only get this term in the partial derivative if $f=\frac{1}{2} a_i E_{jj} X_{ij}^2 b_i+\ldots$ The remaining terms, $$ \sum_{(k,l)\neq(i,j)} a_k X_{kl} E_{lj} b_i, $$ do not involve $X_{ij}$, so they must be the partial derivative w.r.t. $X_{ij}$ of $$ X_{ij}\sum_{(k,l)\neq(i,j)} a_k X_{kl} E_{lj} b_i. $$ So we can write $$ f = \frac{1}{2} a_i E_{jj} X_{ij}^2 b_i + X_{ij}\sum_{(k,l)\neq(i,j)} a_k X_{kl} E_{lj} b_i + g $$
where $g$ is not a function of $X_{ij}$.
Now let's compute $\pdv{ f}{ X_{pr}}$ for $(p,r)\neq(i,j)$: the first term, $\frac{1}{2} a_i E_{jj} X_{ij}^2 b_i$, does not contain $X_{pr}$; the second term is linear in $X_{pr}$, contributing $a_p E_{rj} X_{ij} b_i$; the third contributes $\pdv{ g}{ X_{pr}}$. So we must have $$ \pdv{ f}{ X_{pr}}= a_p E_{rj} X_{ij} b_i+ \pdv{ g}{ X_{pr}} = \sum_{k=1}^n \sum_{l=1}^m a_k X_{kl} E_{lr} b_p. $$ The $X_{ij}$ term on the r.h.s. has coefficient $a_i E_{jr} b_p$; we have already established that $g$ is not a function of $X_{ij}$, so neither is $\pdv{ g}{ X_{pr}}$. Matching the coefficients of $X_{ij}$ gives $$ a_p E_{rj} b_i = a_i E_{jr} b_p \quad \text{for all } i,j,p,r. $$ Taking $r=j$ and using $E_{jj}>0$ (positive definiteness) gives $a_p b_i = a_i b_p$ for all $i,p$, i.e. $a$ and $b$ are proportional (assuming both are nonzero); the condition then forces $E_{rj}=E_{jr}$, i.e. $E$ is symmetric. So unless $a$ and $b$ are proportional (after rescaling, $a=b$) and $E$ is symmetric, the expression you wrote does not have an antiderivative.
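The condition at the end is just Clairaut's theorem (equality of mixed partials) applied to the candidate gradient field, and it is easy to check numerically. A small NumPy sketch (names like `mixed` and `integrable` are mine): the mixed partial $\pdv{}{X_{pr}}\big(a^\top X E e_j\, b_i\big) = a_p E_{rj} b_i$ is stored as a 4-index array, and integrability requires it to be symmetric under swapping the index pairs $(i,j)\leftrightarrow(p,r)$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4
a = rng.standard_normal(n)
b = rng.standard_normal(n)
M = rng.standard_normal((m, m))
E = M @ M.T + m * np.eye(m)        # symmetric positive definite

def mixed(a, E, b):
    # H[i, j, p, r] = d/dX_pr [ df/dX_ij ] = a_p * E_rj * b_i
    return np.einsum('p,rj,i->ijpr', a, E, b)

def integrable(H):
    # Clairaut: H[i, j, p, r] must equal H[p, r, i, j]
    return np.allclose(H, H.transpose(2, 3, 0, 1))

print(integrable(mixed(a, E, b)))  # generic a != b: mixed partials disagree
print(integrable(mixed(a, E, a)))  # b = a with symmetric E: condition holds
```

For a generic pair $a\neq b$ the symmetry fails and no antiderivative exists; setting $b=a$ (with symmetric $E$) makes it hold, in line with the argument above.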