I have only a rough idea of computational complexity and Big-O notation.
Is this idea of an algorithm applicable in mathematics too, or only in computer science?
If it is applicable in mathematics too:
- Why is the complexity of an expression like $n(n-1)$ written as $O(n^2)$, and not something like $O(n^2)\land O(n)$? Is it because of how machines work? I mean, we recall $13^2$ from memory, and for $31^2$ we use the rules of multiplication, but a machine would need to add $31$ to $0$ for $31$ consecutive steps, so most of the computational time is spent there; is that the reason? Even if such reasons apply, we know that the complexity of an expression like $n+5$ is $O(n)$, not $O(1)$, so the $O(n)$ term is not negligible. Perhaps an explanation of the complexity of $n^3+n^2+n$ would clear things up.
- What is the complexity of $(41n)^2-1$, and why?
- What is the complexity of $n!/(n-2)!$? By cancelling out $(n-2)!$, we get $O(n^2)$. But if we do not cancel, I found that the complexity of the factorial is $O(n^n)$. (I suppose machines don't have the idea of cancelling out yet.)
Yes, these notions are used in mathematics as well as in computer science. We may need to specify a "limit point" when using them: in addition to $f(n) = O(g(n))$ as $n \to \infty$, we also see rates given when approaching some finite value, e.g. $f(x) = O(g(x))$ as $x \to 3$.
To see that $n(n-1)$ is $O$ of each of several quadratic expressions, check that the ratio has a finite limit: $$ \lim_n \frac{n(n-1)}{n^2} = 1 < +\infty, \\ \lim_n \frac{n(n-1)}{n^2+n} = 1 < +\infty, \\ \lim_n \frac{n(n-1)}{4n^2-3n+7} = \frac{1}{4} < +\infty, \\ \lim_n \frac{n(n-1)}{n^2+n^{3/2}+n+n^{1/2}} = 1 < +\infty, \\ \lim_n \frac{n(n-1)}{932n^2-12n+\pi} = \frac{1}{932} < +\infty, $$
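A quick numerical check (an illustration at a single large $n$, not a proof) shows the ratios settling near the limits above:

```python
import math

# Evaluate n(n-1)/g(n) at one large n for each denominator g used above.
# This only illustrates the limits; the proofs are the limit computations.
n = 10**6
f = n * (n - 1)

ratios = {
    "n^2":                         f / n**2,
    "n^2 + n":                     f / (n**2 + n),
    "4n^2 - 3n + 7":               f / (4*n**2 - 3*n + 7),
    "n^2 + n^{3/2} + n + n^{1/2}": f / (n**2 + n**1.5 + n + n**0.5),
    "932n^2 - 12n + pi":           f / (932*n**2 - 12*n + math.pi),
}

for g, r in ratios.items():
    print(f"n(n-1) / ({g}) = {r:.6f}")
```

The first, second, and fourth ratios are close to $1$, the third to $1/4$, and the last to $1/932$, matching the limits.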
As $n \to \infty$ we have $$ \frac{(41n)^2-1}{n^2} = \frac{1681 n^2 - 1}{n^2} = 1681 + \frac{-1}{n^2} \to 1681 $$ Any finite limit (or even lim sup) will be OK.
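The same ratio can be watched converging numerically (a sketch, evaluating at a few sample values of $n$):

```python
# Illustrate (41n)^2 - 1 = O(n^2): the ratio ((41n)^2 - 1)/n^2 tends to 1681.
for n in (10, 1000, 10**6):
    ratio = ((41 * n)**2 - 1) / n**2
    print(f"n = {n:>7}: ratio = {ratio}")
```

Since $1681$ is a finite constant, any of the denominators $n^2$, $2n^2$, or $n^2 + n$ would certify $O(n^2)$ equally well.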
Since $n!/(n-2)! = n(n-1)$, the bound $O(n^2)$ is not merely valid but tight. Sometimes this notion of the "best choice" of bound is made precise by writing $$ \frac{n!}{(n-2)!} = \Theta(n^2) $$ which means: $$ \text{both}\quad \frac{n!}{(n-2)!} = O(n^2), \quad\text{and}\quad n^2 = O\left(\frac{n!}{(n-2)!}\right) $$
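The two-sided bound can be checked numerically as well (a sketch: for $n \ge 2$ the ratio stays between fixed positive constants, and in fact tends to $1$):

```python
from math import factorial

# Compute n!/(n-2)! directly (no symbolic cancellation) and compare to n^2.
# Boundedness of the ratio away from 0 and infinity is what Theta(n^2) asserts.
for n in (2, 10, 100, 1000):
    f = factorial(n) // factorial(n - 2)   # equals n*(n-1) after cancellation
    print(f"n = {n:>4}: (n!/(n-2)!) / n^2 = {f / n**2}")
```

Note the ratio is $(n-1)/n$, which lies in $[1/2, 1)$ for all $n \ge 2$, so both $O$ statements above hold with explicit constants.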