Method to Multiply many Matrices simultaneously


Imagine you have three random matrices $$ A = \left[ \begin{array}{cc} 1 & 2 \\ 3 & 4 \end{array} \right] ; \quad B = \left[ \begin{array}{cc} 3 & 4 \\ 5 & 6 \end{array} \right]; \quad C = \left[ \begin{array}{cc} 5 & 6 \\ 7 & 8 \end{array} \right]$$

Now if someone asks me what is $ABC$, I'd be doing the following: $$ ABC = (AB)C =\left[ \begin{array}{cc} 3 + 10 & 4 + 12 \\ 9 + 20 & 12 + 24 \end{array} \right] \cdot \left[ \begin{array}{cc} 5 & 6 \\ 7 & 8 \end{array} \right] = \left[ \begin{array}{cc} 177 & 206 \\ 397 & 462 \end{array} \right] $$ If I had $4$ matrices, I'd be doing $((A\cdot B)\cdot C)\cdot D$, and if I were smarter, $(A\cdot B)\cdot (C \cdot D)$.
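(As a quick sanity check of the worked example above, here is a minimal NumPy sketch; the matrix names match the question, and `@` chains left to right just like $(AB)C$.)

```python
# Numeric check of the worked example: (A @ B) @ C for the question's matrices.
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[3, 4], [5, 6]])
C = np.array([[5, 6], [7, 8]])

# The @ operator associates left to right, so this is ((A @ B) @ C).
product = A @ B @ C
print(product)
# [[177 206]
#  [397 462]]
```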

And what about raising a matrix to some power? I'd be running around with the same matrix, tactically trying to cut down on calculations by spotting repetitive patterns. Sometimes it's just too boring.

Now as natural as our classical method may seem, for me, it's a bit clunky to deal with on paper. At the end of multiplication, I feel that all the water is going into one single bucket and I wasted my efforts diluting stuff in other buckets.

My question is simple, can I pour all the water into the same bucket in one go? That is, if I had a certain number of matrices to multiply, is there some way to multiply all of them together in one single step?


BEST ANSWER

So suppose you have $M$ arbitrary $n \times n$ matrices ($A_1, A_2, \dots, A_M$), their product is $C = A_1 \times A_2 \times \dots \times A_M$, and you want to find the value of $C_{i,j}$ without doing the intermediate multiplications. It is actually conceptually useful to know how to do that.

You probably know that if $C = A \times B$ then: $$C_{i,j} = \sum_{k=1}^n A_{i,k}B_{k,j}$$

When you extend this concept to lots of matrices being multiplied:

$$C_{i,j} = \sum_{k_1=1}^n \sum_{k_2=1}^n \dots \sum_{k_{M-1}=1}^n A_{1\,i,k_1}\,A_{2\,k_1,k_2}\,A_{3\,k_2,k_3}\,A_{4\,k_3,k_4}\,\dots\, A_{M\,k_{M-1},j}$$

How many products are being added? Each extra sum multiplies the count by $n$, so there are $n \times n \times \dots \times n = n^{M-1}$ terms, each a product of $M$ factors. Conceptually, this is every single path through the products.
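The multi-sum formula above can be sketched directly in pure Python: each tuple $(k_1, \dots, k_{M-1})$ is one "path", and $C_{i,j}$ is the sum of one product term per path, with no intermediate matrix products. (The function name is illustrative, not from the answer.)

```python
# Direct "all paths at once" product of M n-by-n matrices,
# following the formula C[i][j] = sum over paths (k_1, ..., k_{M-1}).
from itertools import product

def multiply_all_at_once(mats, n):
    """Multiply a list of n-by-n matrices via the direct multi-sum formula."""
    M = len(mats)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            total = 0
            # Each path (k_1, ..., k_{M-1}) contributes one product of M factors.
            for path in product(range(n), repeat=M - 1):
                idx = (i,) + path + (j,)
                term = 1
                for m in range(M):
                    term *= mats[m][idx[m]][idx[m + 1]]
                total += term
            C[i][j] = total
    return C

A = [[1, 2], [3, 4]]
B = [[3, 4], [5, 6]]
C = [[5, 6], [7, 8]]
print(multiply_all_at_once([A, B, C], 2))  # [[177, 206], [397, 462]]
```

It agrees with the sequential product from the question, but the path loop runs $n^{M-1}$ times, which is exactly the exponential blow-up discussed below.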

Real example:

Suppose you have $40$ matrices to multiply together, all of them $2 \text{ by } 2$ matrices.

If you do it the "classical way" (as you describe it), that's $39$ matrix multiplications, or $4 \times 39 \times 1 = 156$ additions and $4 \times 39 \times 2 = 312$ multiplications.

If you do it the "all at once" way, each entry of $C$ is a sum of $2^{39}$ terms, so there will be roughly $2^{39} = 549755813888$ additions and $2^{39} \times 39 = 21440476741632$ multiplications per entry.

This is why linear algebra is such an amazingly useful study. "Every single path" comes up in a lot of problems (transition matrices, graphs), and if you approach it directly computations can take exponential amounts of time. But writing things as matrices and using the classical approach drops the problems into polynomial time.


A quick note:

You say that if you were smarter, you would calculate $(A\cdot B)\cdot (C\cdot D)$ instead of $((A\cdot B)\cdot C)\cdot D$, but I see no real difference here. In both cases, you must perform $3$ matrix multiplications. In the first case, you calculate $X=A\cdot B$ and $Y=C\cdot D$ and then the result as $Z=X\cdot Y$. In the second case, you calculate $X=A\cdot B$, then $Y=X\cdot C$ and then $Z=Y\cdot D$.

As for your question: as far as I know, there is no way around performing $2$ multiplications when calculating $A\cdot B\cdot C$. The only place where you can actually save some work is when calculating $A^n$: for example, you can calculate $A^4$ by first calculating $A^2$ and then squaring the result ($2$ matrix multiplications instead of $3$).
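The squaring trick above generalizes to any exponent via binary (repeated-squaring) exponentiation, using $O(\log n)$ matrix multiplications instead of $n-1$. Here is a minimal pure-Python sketch for $2 \times 2$ matrices; the helper names are illustrative, not from the answer.

```python
# Repeated-squaring matrix power for 2-by-2 matrices.
def mat_mul(X, Y):
    """Multiply two 2-by-2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(A, n):
    """Compute A^n using O(log n) matrix multiplications."""
    result = [[1, 0], [0, 1]]  # 2-by-2 identity
    base = A
    while n > 0:
        if n & 1:               # this binary digit of n is set:
            result = mat_mul(result, base)
        base = mat_mul(base, base)  # square the running power of A
        n >>= 1
    return result

A = [[1, 2], [3, 4]]
print(mat_pow(A, 4))  # A^4 = (A^2)^2 = [[199, 290], [435, 634]]
```

For $n=4$ this performs exactly the two multiplications described above (square $A$, then square $A^2$).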