Multiplying two matrices $A \cdot B$ of size $n \times n$ in the straightforward (schoolbook) way requires $n^3$ scalar multiplications. However, more efficient algorithms exist: the Strassen algorithm has a lower complexity of approximately $\mathcal{O}(n^{2.81})$, and the Coppersmith–Winograd algorithm achieves a theoretical complexity of $\mathcal{O}(n^{2.376})$.
It seems unlikely that there could be, say, a linear-time algorithm that computes the matrix product. But is there a theoretical lower bound on how efficient any matrix multiplication algorithm can be? Would $\mathcal{O}(n^2)$, for example, be possible?
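To make the gap between the two complexities concrete, here is a sketch (in Python with NumPy; the function names are mine) of the schoolbook $n^3$ method next to a single level of Strassen's scheme, which replaces the 8 block multiplications of the naive block decomposition with 7. Recursing on that step yields the $\mathcal{O}(n^{\log_2 7}) \approx \mathcal{O}(n^{2.81})$ bound.

```python
import numpy as np

def naive_matmul(A, B):
    """Schoolbook multiplication: n^3 scalar multiplications."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

def strassen_step(A, B):
    """One level of Strassen's algorithm for even-sized square
    matrices: 7 block products M1..M7 instead of 8.  Applying this
    recursively gives O(n^{log2 7}) ~ O(n^2.81) overall."""
    n = A.shape[0] // 2
    A11, A12, A21, A22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
    B11, B12, B21, B22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]
    M1 = (A11 + A22) @ (B11 + B22)
    M2 = (A21 + A22) @ B11
    M3 = A11 @ (B12 - B22)
    M4 = A22 @ (B21 - B11)
    M5 = (A11 + A12) @ B22
    M6 = (A21 - A11) @ (B11 + B12)
    M7 = (A12 - A22) @ (B21 + B22)
    # Reassemble the four blocks of the product from the 7 products.
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])
```

Note that $\mathcal{O}(n^2)$ is a trivial lower bound here, since any algorithm must at least read all $2n^2$ input entries and write all $n^2$ output entries.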
I'm also interested in extending the question beyond matrix multiplication. Are there such theoretical lower bounds on complexity for accomplishing a given task with numerical methods (like multiplying two matrices) in general?
There is a survey paper by Virginia Vassilevska Williams, who showed the currently best bound of $\omega \leq 2.373$.
In the introduction she says:
Thus it seems that we don't know more than $2 \leq \omega \leq 2.373$.