Is there a limit for how "good" a numerical method can be?


Multiplying two matrices $A \cdot B$ of size $n \times n$ in the trivial way requires $\mathcal{O}(n^3)$ arithmetic operations. However, more efficient algorithms exist, such as the Strassen algorithm with complexity $\mathcal{O}(n^{\log_2 7}) \approx \mathcal{O}(n^{2.807})$. There is even the Coppersmith-Winograd algorithm with a theoretical complexity of about $\mathcal{O}(n^{2.376})$.
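To make the idea concrete, here is a minimal sketch of Strassen's scheme in pure Python, assuming $n$ is a power of two; the `leaf` cutoff below which it falls back to schoolbook multiplication is a common practical choice, not part of the original algorithm:

```python
def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def naive_mul(A, B):
    # Schoolbook O(n^3) multiplication.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def strassen(A, B, leaf=2):
    # n is assumed to be a power of two; below `leaf`, use naive_mul.
    n = len(A)
    if n <= leaf:
        return naive_mul(A, B)
    m = n // 2
    split = lambda M: ([r[:m] for r in M[:m]], [r[m:] for r in M[:m]],
                       [r[:m] for r in M[m:]], [r[m:] for r in M[m:]])
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    # Seven recursive products instead of eight, which yields the
    # recurrence T(n) = 7 T(n/2) + O(n^2), i.e. O(n^{log2 7}).
    M1 = strassen(mat_add(A11, A22), mat_add(B11, B22), leaf)
    M2 = strassen(mat_add(A21, A22), B11, leaf)
    M3 = strassen(A11, mat_sub(B12, B22), leaf)
    M4 = strassen(A22, mat_sub(B21, B11), leaf)
    M5 = strassen(mat_add(A11, A12), B22, leaf)
    M6 = strassen(mat_sub(A21, A11), mat_add(B11, B12), leaf)
    M7 = strassen(mat_sub(A12, A22), mat_add(B21, B22), leaf)
    # Reassemble the four quadrants of C.
    top = [l + r for l, r in zip(mat_add(mat_sub(mat_add(M1, M4), M5), M7),
                                 mat_add(M3, M5))]
    bot = [l + r for l, r in zip(mat_add(M2, M4),
                                 mat_add(mat_sub(M1, M2), mat_add(M3, M6)))]
    return top + bot
```

The payoff is asymptotic only: each level trades one multiplication for extra additions, so in practice a cutoff like `leaf` is used to switch back to the naive method on small blocks.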

It seems unlikely that there would be, for example, a linear-time algorithm that computes the matrix product, but is there a theoretical lower bound on how efficient any matrix multiplication algorithm can be? Would, for example, $\mathcal{O}(n^2)$ be possible?

I'm also interested in extending the question beyond matrix multiplication. Are there such "theoretical lower bounds on complexity" for accomplishing a given task with numerical methods (like multiplying two matrices) in general?

On BEST ANSWER

There is a survey paper by Virginia Vassilevska Williams, who also proved the currently best upper bound of $\omega \leq 2.373$.

In the introduction she says:

The best known lower bounds for the problem are of the form $c \cdot n^2$ for explicit constants $c \leq 3$. Perhaps because of this, a majority of experts believe that there should exist an essentially quadratic time algorithm for matrix multiplication. Some prominent experts, however, such as Strassen, have also expressed the opinion that matrix multiplication requires strictly more than quadratic time. The answer has evaded computer science researchers for more than 40 years!

Thus it seems that we don't know more than $2 \leq \omega \leq 2.373$.