Suppose $A(t,x)$ is an $n\times n$ matrix that depends on a parameter $t$ and a variable $x$, and let $f(t,x)$ be such that $f(t,\cdot)\colon \mathbb{R}^n \to \mathbb{R}^n$.
Is there a chain rule for $$\frac{d}{dt} A(t,f(t,x))?$$
It should be something like $A_t(t,f(t,x)) + \dots$; what is the other term?
Yes, there is a chain rule for such functions. Before getting to that, let's just briefly discuss partial derivatives for multivariable functions.
Note that this is almost word for word the same definition you might have seen before (or at least, if you think about it for a while, you can convince yourself it's very similar). The idea is of course that we fix all but the $i^{th}$ variable and then consider the derivative of the resulting function at the point $a_i$; by definition, this partial derivative is a linear map from the $i^{th}$ factor space into the target space.
Anyway, the chain rule in this case is as follows: \begin{align} \dfrac{d}{dt}A(t, f(t,x)) &= \dfrac{\partial A}{\partial t}\bigg|_{(t,f(t,x))} + \dfrac{\partial A}{\partial x}\bigg|_{(t,f(t,x))} \left[ \dfrac{\partial f}{\partial t}\bigg|_{(t,x)}\right] \tag{$*$} \end{align} What does this mean? Well, on the LHS, we have a function $\psi: \Bbb{R} \to M_{n \times n}(\Bbb{R})$, defined by \begin{align} \psi(t):= A(t,f(t,x)) \end{align} and we're trying to compute $\psi'(t)$. On the RHS, note that $A: \Bbb{R} \times \Bbb{R}^n \to M_{n \times n}(\Bbb{R})$. So, the first term is $\dfrac{\partial A}{\partial t}\bigg|_{(t,f(t,x))} \in M_{n \times n}(\Bbb{R})$, which is exactly what you predicted.
Now, how do we understand the second term? Again, note that $A$ maps $\Bbb{R} \times \Bbb{R}^n \to M_{n\times n}(\Bbb{R})$. So, $\dfrac{\partial A}{\partial x}\bigg|_{(t,f(t,x))}$ is the partial derivative of $A$ with respect to the variables in $\Bbb{R}^n$ (i.e., we're considering $V_1 = \Bbb{R}$ and $V_2 = \Bbb{R}^n$, so it's the $2$nd partial derivative of $A$), calculated at the point $(t,f(t,x)) \in \Bbb{R} \times \Bbb{R}^n$ of its domain. Note that this by definition is a linear map $\Bbb{R}^n \to M_{n \times n}(\Bbb{R})$. We are now evaluating this linear transformation on the vector $\dfrac{\partial f}{\partial t}\bigg|_{(t,x)} \in \Bbb{R}^n$ to finally end up with the matrix $\dfrac{\partial A}{\partial x}\bigg|_{(t,f(t,x))} \left[ \dfrac{\partial f}{\partial t}\bigg|_{(t,x)}\right] \in M_{n \times n}(\Bbb{R})$. This is how to read the notation in $(*)$.
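If it helps, $(*)$ can be sanity-checked numerically. Here is a small Python sketch: the particular $A$, $f$, and base point are made-up choices just for illustration, all derivatives are approximated by central differences, and the linear map $\partial A/\partial x$ is applied to $\partial f/\partial t$ entry by entry as $\sum_k (\partial A/\partial x_k)\,(\partial f_k/\partial t)$.

```python
import numpy as np

# Made-up concrete choices with n = 2: A(t, x) is a 2x2 matrix, f(t, .) maps R^2 -> R^2.
def A(t, x):
    return np.array([[t * x[0],     x[1] ** 2],
                     [np.sin(x[0]), t ** 2 * x[1]]])

def f(t, x):
    return np.array([t * x[0] + x[1], x[1] + t ** 2])

def central_diff(g, s, h=1e-6):
    """Central finite difference of a function g of one real variable."""
    return (g(s + h) - g(s - h)) / (2 * h)

t0 = 0.7
x0 = np.array([1.3, -0.4])

# LHS: derivative of psi(t) := A(t, f(t, x0)) at t0.
lhs = central_diff(lambda t: A(t, f(t, x0)), t0)

# RHS, term by term:
y0 = f(t0, x0)
dA_dt = central_diff(lambda t: A(t, y0), t0)   # partial A / partial t at (t0, y0)
df_dt = central_diff(lambda t: f(t, x0), t0)   # partial f / partial t, a vector in R^2

# The linear map (partial A / partial x) applied to df_dt:
# sum over k of (partial A / partial x_k) * (partial f_k / partial t).
dA_dx_applied = sum(
    central_diff(lambda s, k=k: A(t0, y0 + (s - y0[k]) * np.eye(2)[k]), y0[k]) * df_dt[k]
    for k in range(2)
)

rhs = dA_dt + dA_dx_applied
print(np.max(np.abs(lhs - rhs)))  # small, up to finite-difference error
```

Both sides agree to within the discretization error of the finite differences.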
If for some reason you don't like to think in terms of linear transformations, here's an alternative approach, in a simplified case, using Jacobian matrices (but I just don't like such a presentation). Suppose that $A$ is a function $A : \Bbb{R} \times \Bbb{R}^n \to \Bbb{R}^m$, and $f: \Bbb{R} \times \Bbb{R}^n \to \Bbb{R}^n$. Then, we can say \begin{align} \dfrac{d}{dt} A(t, f(t,x)) &= (\text{Jac}_{\Bbb{R}} A)(t,f(t,x)) + (\text{Jac}_{\Bbb{R}^n}A)(t, f(t,x)) \cdot \dfrac{\partial f}{\partial t}\bigg|_{(t,x)}\\ &=\dfrac{\partial A}{\partial t}\bigg|_{(t,f(t,x))} + (\text{Jac}_{\Bbb{R}^n}A)(t, f(t,x)) \cdot \dfrac{\partial f}{\partial t}\bigg|_{(t,x)}. \end{align} Note that the Jacobian matrix of $A: \Bbb{R}\times \Bbb{R}^n \to \Bbb{R}^m$ evaluated at the point $(t,f(t,x)) \in \Bbb{R}\times \Bbb{R}^n$, denoted by $(\text{Jac }A)(t, f(t,x))$, is an $m \times (1 +n)$ matrix. So, when I say $(\text{Jac}_{\Bbb{R}}A)(t, f(t,x))$, I mean the $m \times 1$ submatrix obtained by taking the first column (so that we only keep track of the derivative with respect to the $\Bbb{R}$ variable, i.e., with respect to $t$). You see, this is just a vector in $\Bbb{R}^m$. Next, when I say $(\text{Jac}_{\Bbb{R}^n}A)(t, f(t,x))$, I mean the $m \times n$ submatrix obtained by ignoring the first column (so that we only keep track of the derivative with respect to the $\Bbb{R}^n$ variables). Then, we multiply this $m \times n$ matrix by the $n \times 1$ vector $\dfrac{\partial f}{\partial t}\bigg|_{(t,x)}$, to finally get an $m \times 1$ matrix, or simply a vector in $\Bbb{R}^m$.
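The block bookkeeping here can also be checked numerically. The Python sketch below (with made-up $A$ and $f$, $m = 3$, $n = 2$) builds the full $m \times (1+n)$ Jacobian of $A$ by finite differences and splits off the $m \times 1$ and $m \times n$ blocks:

```python
import numpy as np

# Made-up example with m = 3, n = 2: A maps R x R^2 -> R^3, f maps R x R^2 -> R^2.
def A(t, x):
    return np.array([t * x[0], x[0] * x[1], t + x[1] ** 2])

def f(t, x):
    return np.array([x[0] + t, t * x[1]])

def jacobian(g, z, h=1e-6):
    """Numerical Jacobian of g: R^p -> R^m at z, one column per input variable."""
    cols = []
    for k in range(len(z)):
        e = np.zeros(len(z)); e[k] = h
        cols.append((g(z + e) - g(z - e)) / (2 * h))
    return np.stack(cols, axis=1)

t0, x0 = 0.5, np.array([1.0, 2.0])
y0 = f(t0, x0)

# Full Jacobian of A at (t0, y0): an m x (1 + n) matrix.
J = jacobian(lambda z: A(z[0], z[1:]), np.concatenate([[t0], y0]))
Jac_R  = J[:, 0]    # m x 1 block: derivative in the R variable (t)
Jac_Rn = J[:, 1:]   # m x n block: derivative in the R^n variables

df_dt = (f(t0 + 1e-6, x0) - f(t0 - 1e-6, x0)) / 2e-6

rhs = Jac_R + Jac_Rn @ df_dt
lhs = (A(t0 + 1e-6, f(t0 + 1e-6, x0)) - A(t0 - 1e-6, f(t0 - 1e-6, x0))) / 2e-6
print(np.max(np.abs(lhs - rhs)))  # small, up to finite-difference error
```

The matrix product `Jac_Rn @ df_dt` is exactly the "$m \times n$ times $n \times 1$" multiplication described above.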
The reason I don't like this approach is because in your case, the target space is $M_{n \times n}(\Bbb{R})$, so it is not natural to think of it as $\Bbb{R}^m$. I mean sure, you could construct an isomorphism to $\Bbb{R}^{n^2}$, but this requires a certain choice of basis in order to "vectorize" a matrix. But then in the end you will probably want to "undo" the vectorization, and then the whole thing is just a mess. Doable, but I think it's very ad hoc, and that it's much cleaner to treat everything as linear transformations, because then it doesn't matter what the domain or target space are... it's pretty much linear algebra from here.
To hopefully convince you more about the generality (and simplicity) of the linear transformations approach, let $V,W$ be normed vector spaces, let $A: \Bbb{R} \times V \to W$ be a differentiable map, and let $f: \Bbb{R} \times V \to V$ be differentiable. Then, \begin{align} \dfrac{d}{dt} \bigg|_t A(t,f(t,x)) &= \dfrac{\partial A}{\partial t} \bigg|_{(t,f(t,x))} + \dfrac{\partial A}{\partial x} \bigg|_{(t,f(t,x))}\left[ \dfrac{\partial f}{\partial t}\bigg|_{(t,x)}\right] \in W \end{align} i.e., the formula for the chain rule stays exactly the same, regardless of what the vector spaces $V,W$ are. But if you insist on thinking of everything in terms of Jacobian matrices, you're going to have a tough time first constructing isomorphisms $V \cong \Bbb{R}^n$ and $W \cong \Bbb{R}^m$, then doing everything in the Cartesian spaces, and then "undoing" the isomorphisms to re-express everything back in terms of the spaces $V$ and $W$.
Or of course, another way to think of it is to express everything in terms of component functions of the matrix-valued function $A$: \begin{align} \dfrac{d}{dt}A_{ij}(t,f(t,x)) &= \dfrac{\partial A_{ij}}{\partial t}\bigg|_{(t,f(t,x))} + \sum_{k=1}^n\dfrac{\partial A_{ij}}{\partial x_k}\bigg|_{(t,f(t,x))} \cdot \dfrac{\partial f_k}{\partial t}\bigg|_{(t,x)} \end{align} (all these partial derivatives being real numbers). But of course, for obvious reasons, this component-by-component approach can get very tedious very quickly (and doesn't generalize well), and also it didn't seem to be what you really wanted to ask about, which is why I'm mentioning it at the end.
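The component formula is easy to verify symbolically for a single entry. Here is a quick sympy sketch with one made-up component $A_{11}$ and made-up $f_1, f_2$ (taking $n = 2$):

```python
import sympy as sp

t, x1, x2 = sp.symbols('t x1 x2')

# Made-up f and one made-up component A_{11}(t, x1, x2), just for illustration:
f1, f2 = t * x1, x2 + t ** 2
A11 = lambda u, v1, v2: u * v1 + sp.sin(v2)

# LHS: d/dt of A_{11}(t, f(t, x)).
lhs = sp.diff(A11(t, f1, f2), t)

# RHS: partial_t A_{11} + sum_k partial_{x_k} A_{11} * partial_t f_k,
# with the partials of A_{11} evaluated at (t, f(t, x)).
u, v1, v2 = sp.symbols('u v1 v2')
expr = A11(u, v1, v2)
rhs = (sp.diff(expr, u)
       + sp.diff(expr, v1) * sp.diff(f1, t)
       + sp.diff(expr, v2) * sp.diff(f2, t)).subs({u: t, v1: f1, v2: f2})

print(sp.simplify(lhs - rhs))  # 0
```

The difference simplifies to zero, confirming the entrywise chain rule for this choice of $A_{11}$ and $f$.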