How to prove computational cost of taking matrix powers


How would I go about proving a bound on the computational cost of repeatedly multiplying an $n\times n$ square matrix by itself, up to $n$ times? (Is this even the correct place for me to ask this question?)


BEST ANSWER

The computational complexity of multiplying two $n\times n$ matrices lies between $O(n^{2.37})$ (the best known fast algorithms) and $O(n^3)$ (the naive algorithm), depending on which algorithm you use. Let's call it $O(n^k)$.

So if $A\in\mathbb{R}^{n\times n}$ and you compute $A^m$ by repeated naive matrix multiplication, it will cost $O(n^k m)$.
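As a sketch of this naive approach (the function name `naive_power` is my own, not from the question), each step is one $O(n^k)$ matrix multiplication, and there are $m-1$ steps:

```python
import numpy as np

def naive_power(A, m):
    """Compute A^m by successive multiplication: m-1 products,
    each costing O(n^k), for O(n^k m) total."""
    result = A.copy()
    for _ in range(m - 1):
        result = result @ A  # one O(n^k) matrix multiplication per step
    return result

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
print(naive_power(A, 3))  # diagonal entries 8 and 27
```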

If you want to be a little smarter and compute the power by diagonalization, the complexity of finding the eigendecomposition is generally taken to be around $O(n^3)$, although technically there is no finite bound on the number of operations (e.g. see here or here). Then you can take the matrix power by raising $n$ numbers (the eigenvalues) to the $m$th power (at worst $O(nm)$, and it can be better (e.g. here)) and doing $2$ matrix multiplications (negligible). So the cost is more like $O(n^3 + nm)$, which can be much cheaper even in this naive setup.
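A minimal sketch of the diagonalization route, assuming $A$ is diagonalizable so that $A^m = V\,\mathrm{diag}(\lambda)^m\,V^{-1}$ (the helper name `power_by_diagonalization` is mine):

```python
import numpy as np

def power_by_diagonalization(A, m):
    """Compute A^m via A = V diag(lam) V^{-1}:
    one ~O(n^3) eigendecomposition, n scalar powers,
    and two matrix multiplications."""
    lam, V = np.linalg.eig(A)                   # eigendecomposition, ~O(n^3)
    return V @ np.diag(lam ** m) @ np.linalg.inv(V)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, hence diagonalizable
print(power_by_diagonalization(A, 4).real)  # [[41, 40], [40, 41]]
```

Note this assumes diagonalizability; for defective matrices you would need the Jordan form or a different method entirely.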


Edit: here are some details on why naive matrix multiplication takes that many operations.

Let $A\in\mathbb{R}^{n\times m}$, $B\in\mathbb{R}^{m\times k}$. Then $C=AB\in\mathbb{R}^{n\times k}$.

Standard matrix multiplication for one element is given by: $$ C_{ij} = \sum_{p=1}^m A_{ip} B_{pj} $$ How many operations is this (for one element)? It is $m$ multiplications and $m$ additions, i.e. $2m$ floating point operations. But there are $nk$ elements to fill. Thus, naive "dot product" multiplication requires $2nkm$ operations. If $n=k=m$, this is $O(n^3)$.
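The count above can be verified directly by writing the triple loop and tallying flops (the function `naive_matmul_with_flops` is an illustrative helper, not from the answer; each accumulation into $C_{ij}$ counts one multiply and one add, giving $2nkm$ total):

```python
import numpy as np

def naive_matmul_with_flops(A, B):
    """Naive triple-loop matrix multiply, counting floating point ops:
    each of the n*k output entries needs m multiplies and m adds,
    so 2*n*k*m flops in total."""
    n, m = A.shape
    m2, k = B.shape
    assert m == m2, "inner dimensions must match"
    C = np.zeros((n, k))
    flops = 0
    for i in range(n):
        for j in range(k):
            for p in range(m):
                C[i, j] += A[i, p] * B[p, j]  # 1 multiply + 1 add
                flops += 2
    return C, flops

A = np.random.rand(3, 4)
B = np.random.rand(4, 5)
C, flops = naive_matmul_with_flops(A, B)
print(flops)  # 2 * 3 * 5 * 4 = 120
```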

Now, if $A\in\mathbb{R}^{n\times n}$ and you want to compute $A^w=A\cdots A$ naively, it takes $w-1$ matrix multiplications, each costing $O(n^3)$. Hence, the cost is $O(n^3 w)$.