I'm looking for some references on the complexity for the following kind of problem:
Given two bases $(a_1, \dots, a_n)$ and $(b_1, \dots, b_n)$ of a $K(x)$-vector space $V$, I want to compute the transformation matrix $M \in K(x)^{n \times n}$ which satisfies the equation $$ (b_1, \dots, b_n) \cdot M = (a_1, \dots, a_n).$$
Either I simply don't know the name of this kind of problem, or there is some other rather trivial reason why I can't find any references. Much research effort has gone into solving linear systems of the form $$ Ax = b $$ for given $A$ and $b$, so I see no reason why the same shouldn't have been done for my problem. Perhaps there is a direct link between these tasks, or some other reduction of my problem to a well-known problem whose complexity has already been analyzed.
Edit: A problem that is equivalent for my purposes is to determine the matrix $M$ satisfying $$ M_b \cdot M = M_a, $$ where $M_a, M_b$ are the representation matrices of $(a_1,\dots,a_n)$ and $(b_1,\dots,b_n)$ with respect to a fixed basis of the $K(x)$-vector space under consideration. Hence polynomial matrix inversion would achieve my goal; unfortunately, it has time complexity $O^{\sim}(n^3 d)$, where $d$ denotes the maximal degree of the entries of the matrix in question (cf. "On the Complexity of Polynomial Matrix Computations"). The complexity I am looking for, however, is $O^{\sim}(n^\omega d)$.
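For concreteness, here is a minimal SymPy sketch of the reduction above (toy matrices of my own choosing, not from any reference): it computes $M = M_b^{-1} M_a$ over $\mathbb{Q}(x)$ by generic symbolic elimination, i.e. the $O^{\sim}(n^3 d)$ route rather than the hoped-for $O^{\sim}(n^\omega d)$ one.

```python
# Sketch: change-of-basis matrix M with M_b * M = M_a over Q(x),
# via SymPy's symbolic matrix inverse (generic Gaussian elimination).
import sympy as sp

x = sp.symbols('x')

# Toy representation matrices of two bases in some fixed basis of V.
M_a = sp.Matrix([[x, 1], [0, x**2 + 1]])
M_b = sp.Matrix([[1, x], [x, 1]])   # det = 1 - x^2 != 0 in Q(x), so invertible

# Change-of-basis matrix; entries live in the rational function field Q(x).
M = (M_b.inv() * M_a).applyfunc(sp.simplify)

# Sanity check: M_b * M recovers M_a.
assert (M_b * M - M_a).applyfunc(sp.simplify) == sp.zeros(2, 2)
print(M)
```

In practice one would rather solve the system $M_b \cdot M = M_a$ column by column (e.g. with `M_b.solve(M_a)`) than form the inverse explicitly, but the asymptotic cost of the generic approach is the same.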
I would appreciate any kind of help, recommendation, or hint at the name of this problem. Thank you very much!