Representation Matrix computation time estimation


Abstract/general version of the question

If we have a matrix $$ M_z = \begin{pmatrix} \sum_{j=1}^{n}\lambda_j \lambda_{j,1,1} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,1}\\ \sum_{j=1}^{n}\lambda_j \lambda_{j,1,2} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,2}\\ \vdots & & \vdots\\ \sum_{j=1}^{n}\lambda_j \lambda_{j,1,n} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,n} \end{pmatrix} \in k[x]^{n \times n} $$ with polynomial entries, whose entries have the form of the entries of a product of two $n \times n$ matrices, is it possible to compute this matrix in time $O(dn^{\omega})$? Here $O(d)$ denotes the cost of one arithmetic operation, i.e. a multiplication or addition of the polynomials $\lambda_j \lambda_{j,i,m}$, and $O(n^{\omega})$ denotes the cost of multiplying two $n \times n$ matrices. Is this possible merely because the matrix has the general form of a matrix product?

More specific

Let $F$ be an algebraic function field and $n = [F:k(x)]$ for a separating element $x$ which is transcendental over $k$. We define $R = Cl(k[x],F)$ to be the integral closure of $k[x]$ in $F$. Then $R$ is a $k[x]$-module of rank $n$; let $\omega_1,\ldots,\omega_n$ be an integral basis of $R$. For any integral ideal $I \leq R$ (which is also a $k[x]$-module of rank $n$) and any $k[x]$-module basis $\alpha_1,\ldots,\alpha_n$ of $I$, there exists a unique representation matrix $M_I$ satisfying $$(\alpha_1,\ldots,\alpha_n) = (\omega_1,\ldots,\omega_n) M_I.$$ In the case of a principal ideal $I=Rz$ we have $$(z\omega_1,\ldots,z\omega_n) = (\omega_1,\ldots,\omega_n) M_z.$$ For the computation of products of elements in $R$ we have the so-called uniquely determined multiplication tables $\lambda_{i,j,m} \in k[x]$ satisfying $$\omega_i \omega_j = \sum_{m=1}^{n} \lambda_{i,j,m} \omega_m.$$ Considering the basis $(z\omega_1,\ldots,z\omega_n)$ of $Rz$ and writing $z = \sum_{i=1}^{n}\lambda_i \omega_i$, the representation matrix is the following: $$ M_z = \begin{pmatrix} \sum_{j=1}^{n}\lambda_j \lambda_{j,1,1} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,1}\\ \sum_{j=1}^{n}\lambda_j \lambda_{j,1,2} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,2}\\ \vdots & & \vdots\\ \sum_{j=1}^{n}\lambda_j \lambda_{j,1,n} &\dots & \sum_{j=1}^{n}\lambda_j \lambda_{j,n,n} \end{pmatrix} $$ We make the following assumptions: both $\lambda_j$ and $\lambda_{i,j,m}$ have degree in $O(d)$, where $d=g/n$ and $g$ denotes the genus of the function field $F$. Further, we can assume that each multiplication $\lambda_j\lambda_{i,j,m}$ takes only $O(d)$ steps.
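To make the construction concrete, here is a minimal Python sketch (all names and the small example data are hypothetical) that assembles $M_z$ entrywise from given multiplication tables $\lambda_{j,i,m}$ and coefficients $\lambda_j$ of $z$, representing polynomials in $k[x]$ as coefficient lists. Note that this naive assembly performs $n^3$ polynomial multiplications, which is exactly why the $O(dn^{\omega})$ claim needs an argument:

```python
# Polynomials in k[x] are represented as coefficient lists [c0, c1, ...].
def poly_add(p, q):
    """Coefficient-wise sum of two polynomials."""
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def poly_mul(p, q):
    """Convolution product of two polynomials."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def representation_matrix(lam, tables):
    """Assemble M_z with entry (m, i) = sum_j lam[j] * tables[j][i][m].

    lam    : list of n polynomials (coefficients of z in the integral basis)
    tables : tables[j][i][m] is the polynomial lambda_{j,i,m}
    """
    n = len(lam)
    M = [[[0] for _ in range(n)] for _ in range(n)]
    for m in range(n):
        for i in range(n):
            acc = [0]
            for j in range(n):  # n^3 polynomial multiplications in total
                acc = poly_add(acc, poly_mul(lam[j], tables[j][i][m]))
            M[m][i] = acc
    return M
```

For instance, if $\omega_1$ is the unit (so $\lambda_{1,i,m} = \delta_{i,m}$) and $z = \omega_1$, the sketch returns the identity matrix, as expected.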

To show

Now we have to argue that the computation of the representation matrix $M_z$ takes only $O(dn^{\omega})$ time (which is true), where $O(n^{\omega})$ denotes the time for the matrix multiplication of two $n \times n$ matrices.

Ideas

  • Informally, one could argue that $M_z$ has in general the form of a matrix product of two $n \times n$ matrices, so its computation takes time $O(n^{\omega})$ in terms of arithmetic operations, and each arithmetic operation, i.e. a multiplication/addition of the polynomials $\lambda_j \lambda_{j,i,m}$, requires $O(d)$; therefore we could conclude a total effort of $O(dn^{\omega})$.
  • One could try to find an explicit representation of $M_z$ as a product of two $n \times n$ matrices. But there are $n^3$ multiplication tables $\lambda_{i,j,m}$, while two $n \times n$ matrices have only $2n^2$ entries, so in general it seems impossible to find such matrices.
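Regarding the second idea: even without a factorization, the entrywise formula does show that $M_z = \sum_{j=1}^{n} \lambda_j T_j$, where $T_j$ is the representation matrix of $\omega_j$, i.e. $(T_j)_{m,i} = \lambda_{j,i,m}$. This is a $\lambda_j$-weighted sum of $n$ fixed matrices, which naively still costs $n^3$ polynomial multiplications rather than $O(n^{\omega})$. A small Python sketch (with integers as hypothetical stand-ins for the polynomials in $k[x]$, and made-up table data) checking this identity:

```python
# Scalar stand-ins for the polynomials (hypothetical n = 3 data):
# T[j] plays the role of the representation matrix of omega_j,
# with T[j][m][i] = lambda_{j,i,m}.
n = 3
lam = [2, -1, 3]
T = [[[(j + 2 * i - m) % 5 for i in range(n)] for m in range(n)]
     for j in range(n)]

def mat_scale(c, A):
    """Multiply every entry of A by the scalar c."""
    return [[c * x for x in row] for row in A]

def mat_add(A, B):
    """Entrywise sum of two matrices of equal shape."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# M_z as the lambda_j-weighted sum of the n fixed matrices T_j ...
M = [[0] * n for _ in range(n)]
for j in range(n):
    M = mat_add(M, mat_scale(lam[j], T[j]))

# ... which agrees with the entrywise formula
# (M_z)_{m,i} = sum_j lam_j * lambda_{j,i,m}.
M_entrywise = [[sum(lam[j] * T[j][m][i] for j in range(n)) for i in range(n)]
               for m in range(n)]
```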