I'm working on a problem in numerical analysis, specifically a matrix transformation. I need to reduce the matrix below to a simpler form involving a dot product "$\cdot$", a cross product "$\times$", or a tensor product "$\otimes$", since the double sum in each entry makes computation very expensive. $$ \begin{pmatrix} \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_j(x_i) & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_j(x_{i+\tfrac{1}{n}}) & \dots & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_j(x_{i+\tfrac{n}{n}}) \\ % \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{1}{n}}(x_i) & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{1}{n}}(x_{i+\tfrac{1}{n}}) & \dots & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{1}{n}}(x_{i+\tfrac{n}{n}}) \\ % \vdots & \vdots & \ddots & \vdots \\ % \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{n}{n}}(x_i) & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{n}{n}}(x_{i+\tfrac{1}{n}}) & \dots & \displaystyle\sum_{i=0}^{m_i}\sum_{j=0}^{m_j} g_{j+\tfrac{n}{n}}(x_{i+\tfrac{n}{n}}) \end{pmatrix} $$ In the matrix above, $g_{\alpha}(x)$ is an arbitrary function and the points $x_{\lambda}$ are given ($\alpha,\lambda\in\mathbb{R}$).
Can someone give me an idea of how to do this?
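For concreteness, here is a minimal NumPy sketch of the matrix I mean. The choices $g_\alpha(x) = \sin(\alpha x)$, $x_\lambda = \lambda$, and the small sizes `m_i`, `m_j`, `n` are made up just to have something runnable; they are not part of the actual problem. The entry at row $r$, column $c$ is $\sum_{i}\sum_{j} g_{j+r/n}(x_{i+c/n})$, so the whole matrix can also be written as a single contraction of the 4-D array $T[r,j,c,i] = g_{j+r/n}(x_{i+c/n})$ over $i$ and $j$ — any simplification should reproduce these values:

```python
import numpy as np

# Hypothetical stand-ins (NOT from the actual problem), only to make
# the matrix concrete: g_alpha(x) = sin(alpha * x) and x_lam = lam.
def g(alpha, x):
    return np.sin(alpha * x)

def x(lam):
    return lam

m_i, m_j, n = 4, 3, 2  # small sample sizes (assumed)

# Brute force: entry (r, c) = sum_{i=0}^{m_i} sum_{j=0}^{m_j} g_{j + r/n}(x_{i + c/n})
M_loops = np.empty((n + 1, n + 1))
for r in range(n + 1):
    for c in range(n + 1):
        M_loops[r, c] = sum(
            g(j + r / n, x(i + c / n))
            for i in range(m_i + 1)
            for j in range(m_j + 1)
        )

# Same matrix as one tensor contraction: build the 4-D array
# T[r, j, c, i] = g_{j + r/n}(x_{i + c/n}) and sum out j and i.
alphas = np.arange(n + 1)[:, None] / n + np.arange(m_j + 1)[None, :]  # shape (n+1, m_j+1)
lams = np.arange(n + 1)[:, None] / n + np.arange(m_i + 1)[None, :]    # shape (n+1, m_i+1)
T = g(alphas[:, :, None, None], x(lams[None, None, :, :]))            # shape (n+1, m_j+1, n+1, m_i+1)
M_contracted = np.einsum('rjci->rc', T)

print(np.allclose(M_loops, M_contracted))  # → True
```

The contraction form makes the structure visible: the double sum only mixes the "inner" indices $i, j$, while the row and column of the matrix are carried entirely by the fractional shifts $r/n$ and $c/n$.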