Much of the literature on matrix calculus deals with derivatives.
Let $x \in \mathbb{R}^{n}$ and $A \in \mathbb{R}^{n \times n}$. It is known, for example, that:
$$f(x) = Ax \implies \nabla_x f(x) = A \tag{1}$$ $$f(x) = x^TAx \implies \nabla_x f(x) = (A + A^T)x \tag{2}$$
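Both identities can be sanity-checked numerically. The sketch below uses NumPy with an illustrative random $A$ and $x$; for $f(x)=Ax$ it compares the Jacobian, which is what identity $(1)$ denotes by $\nabla_x f$.

```python
# Finite-difference check of identities (1) and (2); A and x are
# illustrative random values, not taken from the question.
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
h = 1e-6

# (1) Jacobian of f(x) = Ax via central differences: should equal A
jac = np.empty((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = h
    jac[:, j] = (A @ (x + e) - A @ (x - e)) / (2 * h)
assert np.allclose(jac, A, atol=1e-8)

# (2) gradient of f(x) = x^T A x: should equal (A + A^T) x
f = lambda v: v @ A @ v
grad = np.array([(f(x + h * np.eye(n)[j]) - f(x - h * np.eye(n)[j])) / (2 * h)
                 for j in range(n)])
assert np.allclose(grad, (A + A.T) @ x, atol=1e-4)
```

Central differences are exact for the linear map and second-order accurate for the quadratic form, so both comparisons pass with tight tolerances.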
How do I do the opposite? In particular, if I have $f(x) = Ax$, how can I know the antiderivative?
If $A$ were symmetric, I could perhaps find a matrix $P$ such that $A = P + P^T$, which would let me use the statement $(2)$ to claim that $\int dx^T Ax = x^TPx$.
What could be said about the integral for a generally asymmetric matrix $A$?
There are several mistakes here.
i) You are dealing with the gradient, not the derivative.
ii) Note that $x\in\mathbb{R}^n\rightarrow Ax\in\mathbb{R}^n$. In general, the gradient is defined only for real-valued functions; for example, $(Ax)_i=L_ix$, where $L_i$ is the $i^{\text{th}}$ row of $A$, and then $\nabla (Ax)_i=L_i^T$. Yet we can extend the definition, for example as follows: $\nabla(Ax)=(\nabla(Ax)_i)_i$.
iii) If you want the antiderivative of $Ax$, it suffices to calculate the antiderivative of each component $(Ax)_i=L_ix$. But what do you mean by "antiderivative"?
We can consider it as an integral in $n$ variables. By an easy calculation, $\int_{[0,u_1]\times\cdots\times [0,u_n]}L_ix\,dx=\dfrac{u_1\cdots u_n}{2}L_i[u_1,\cdots,u_n]^T$.
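This box integral can be checked numerically in $n=2$; the row $L_i$ and the box $[0,u_1]\times[0,u_2]$ below are illustrative values. Since the integrand is linear, the midpoint rule reproduces the closed form essentially exactly.

```python
# Midpoint-rule check of the box integral for n = 2; L and u are
# illustrative, not taken from the answer.
import numpy as np

L = np.array([1.5, -0.7])          # a row L_i of A
u = np.array([0.8, 1.3])           # the box [0, u_1] x [0, u_2]

m = 400
x1 = (np.arange(m) + 0.5) * (u[0] / m)     # midpoints along x_1
x2 = (np.arange(m) + 0.5) * (u[1] / m)     # midpoints along x_2
X1, X2 = np.meshgrid(x1, x2, indexing="ij")
cell = (u[0] / m) * (u[1] / m)             # area of one grid cell
numeric = np.sum(L[0] * X1 + L[1] * X2) * cell

closed_form = (u[0] * u[1] / 2) * (L @ u)  # (u_1 u_2 / 2) L_i [u_1, u_2]^T
assert abs(numeric - closed_form) < 1e-9
```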
Alternatively, we can suppose that the $(x_i)$ are functions of $t$. Then, since $A$ is constant,
$\int_{0}^t Ax(u)\,du=A\int_{0}^t x(u)\,du$.
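The point is pure linearity: a constant matrix commutes with the integral sign. A quick discrete check, with an illustrative $A$ and componentwise $x(u)$:

```python
# Linearity check: the constant matrix A pulls out of the integral in t.
# The matrix, the interval, and x(u) below are illustrative choices.
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, 3.0]])
t = np.linspace(0.0, 2.0, 2001)
du = t[1] - t[0]
x = np.vstack([np.sin(t), np.exp(-t)])        # x(u), shape (2, len(t))

# left Riemann sums; they agree because A is constant
lhs = (A @ x)[:, :-1].sum(axis=1) * du        # integral of A x(u) du
rhs = A @ (x[:, :-1].sum(axis=1) * du)        # A times integral of x(u) du
assert np.allclose(lhs, rhs, atol=1e-10)
```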
iv) Your last three lines are nonsense.
EDIT. Perhaps by "$g$ is an antiderivative of $Ax$" you mean $\nabla(g)=Ax$.
Then
$\dfrac{\partial g}{\partial x_i}=\sum_k a_{i,k}x_k$ and $\dfrac{\partial g}{\partial x_j}=\sum_k a_{j,k}x_k$,
so $\dfrac{\partial^2 g}{\partial x_j\partial x_i}=a_{i,j}$ and $\dfrac{\partial^2 g}{\partial x_i\partial x_j}=a_{j,i}$. Since the mixed partials are equal (Schwarz's theorem), necessarily $a_{i,j}=a_{j,i}$, that is, $A$ is symmetric.
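The obstruction is visible numerically: when $A$ is not symmetric, the field $F(x)=Ax$ has nonzero circulation around a closed loop, so no potential $g$ can exist. By Green's theorem, on the unit circle in $n=2$ the circulation equals $(a_{2,1}-a_{1,2})\pi$; the matrix below is an illustrative non-symmetric example.

```python
# Circulation of F(x) = Ax around the unit circle; nonzero because
# a_{1,2} != a_{2,1}.  The matrix A is illustrative.
import numpy as np

A = np.array([[1.0, 2.0],
              [5.0, 3.0]])                           # a_{1,2} != a_{2,1}

theta = np.linspace(0.0, 2.0 * np.pi, 20001)
x = np.vstack([np.cos(theta), np.sin(theta)])        # unit circle
dxdt = np.vstack([-np.sin(theta), np.cos(theta)])    # its tangent
integrand = np.sum((A @ x) * dxdt, axis=0)           # F . dx/dtheta

# composite trapezoid rule for the loop integral
dtheta = theta[1] - theta[0]
circulation = np.sum((integrand[:-1] + integrand[1:]) / 2) * dtheta

# Green's theorem predicts (a_{2,1} - a_{1,2}) * pi
assert abs(circulation - (A[1, 0] - A[0, 1]) * np.pi) < 1e-6
```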
When $A$ is symmetric, every solution can be written $g=\frac{1}{2}x^TAx+c$, where $c$ is a constant.
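The symmetric case can also be verified by finite differences: with $A$ symmetric and $g(x)=\frac{1}{2}x^TAx$, the numerical gradient of $g$ recovers $Ax$. The matrix and point below are illustrative.

```python
# Check that g(x) = (1/2) x^T A x satisfies grad g = Ax when A is
# symmetric; A and x are illustrative random values.
import numpy as np

rng = np.random.default_rng(2)
n = 5
M = rng.standard_normal((n, n))
A = M + M.T                      # symmetric by construction
x = rng.standard_normal(n)

g = lambda v: 0.5 * v @ A @ v
h = 1e-6
grad = np.array([(g(x + h * np.eye(n)[j]) - g(x - h * np.eye(n)[j])) / (2 * h)
                 for j in range(n)])
assert np.allclose(grad, A @ x, atol=1e-4)
```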