Why is the return value of a function sometimes shown as an argument in formulas?


Sorry if this is a newbie question; I had a hard time in school.

Well, the title states the problem; let's look at an example, which I took from the Coursera video lectures on algorithms:

Claim: if $T(n) = a_{k}n^{k}+\dots+a_{1}n+a_{0}$, then $T(n)=O(n^{k})$

This formula comes from one of the videos. You can see that $T(n)$ is used in the usual way: it declares a function $T$ that takes one argument $n$. That's pretty clear.

As you probably know, the so-called Big-Oh $O()$ describes the speed of an algorithm (in the worst case, but that doesn't matter for now). I can accept that in the statement $T(n)=O(n^{k})$ the author wanted to say: "the speed of the function $T(n)$ is $n^{k}$". But that is absolutely not what I see! What I see is: "the result of the function $T(n)$ equals the speed of computing $n^{k}$". Because Big-Oh looks like a function that takes a function as an argument and returns its speed. So this formula should be written as $O(T(n))=n^{k}$ (or we may drop the argument, since we know that $T$ is a function of one argument, and the formula becomes $O(T)=n^{k}$).

It is pretty confusing. At the beginning of the lectures I could somehow guess what the author was talking about, but some videos later I got stuck; I don't have time to work through all of the author's calculations (mostly because English is not my first language), so I try to look at the pictures instead, and all this mess just blows my mind!

There are 3 answers below.

BEST ANSWER

Big-O is commonly used for complexity analysis of algorithms, as you said, but it actually comes from mathematics, in the study of asymptotic growth.

For example, when I say in English "sorting $n$ integers can be done in $O(n\log n)$ time", it means more precisely the following:

Let $T(n)$ be the amount of time (either in seconds, or in amount of computer operations) it takes to sort $n$ integers. Then $T(n) = O(n \log n)$.

To understand the last part of the above we need to understand what big O means.

And this is the definition: a function or mathematical expression is said to be $O(\text{something})$ if there exists a positive constant $C$ such that, for all sufficiently large inputs, the function or expression does not exceed $C$ times $\text{something}$.

In the above example, with sorting integers, this means:

There exists a constant $C>0$ such that, for all sufficiently large $n$, it takes no more than $C\cdot n\cdot \log n$ seconds to sort $n$ integers.

And in the example in your question, it means:

Suppose $T(n) = a_k n^k + \ldots + a_1 n + a_0$. Then there exists a constant $C > 0$ such that $T(n) \le C n^k$ for all sufficiently large $n$.
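To see why such a $C$ exists, here is one explicit choice (a standard derivation, not spelled out in the lecture): take the sum of the absolute values of the coefficients. For all $n \ge 1$,

$$|T(n)| \le |a_k|n^k + \dots + |a_1|n + |a_0| \le \left(|a_k| + \dots + |a_1| + |a_0|\right) n^k,$$

where the second inequality uses $n^j \le n^k$ for $j \le k$ and $n \ge 1$. So $C = |a_k| + \dots + |a_0|$ works.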


Big-Oh does not inherently measure speed. It describes the overall growth of arbitrary functions as their argument(s) tend to infinity (or sometimes toward a different point). One of the main applications of Big-Oh is complexity theory, where the functions considered could be the running time of an algorithm as a function of its input size, or the memory footprint of an algorithm as a function of its input size. So when we say $f(n)=O(g(n))$ (which, by the way, is an abuse of the equals sign, but it's established), we only say that there exists a constant $c$ such that $|f(n)|<cg(n)$ for almost all $n\in\mathbb N$.
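As a quick numeric illustration of that definition (the polynomial and the constant below are my own example choices, not from the lecture), we can check the inequality $|f(n)| < c\,g(n)$ directly:

```python
# Numeric sketch of the definition: |f(n)| < c * g(n) for almost all n.
# f and the constant c are arbitrary example choices.
def f(n):
    return 3 * n**2 + 5 * n + 7  # an example quadratic polynomial

def g(n):
    return n**2

c = 16  # 3 + 5 + 7 + 1 gives a strict inequality for every n >= 1
assert all(abs(f(n)) < c * g(n) for n in range(1, 10_000))
print("f(n) = O(n^2), witnessed by c =", c)
```

Any larger $c$ would work just as well; the definition only asks that some such constant exists.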


Others have already pointed to the definition of $O(x)$. I'll try to present it from a different perspective.

$O(x)$ is not a function. It's a set of functions represented by a single base function like $x$, $x^n$, $1/x$, etc. The functions in the same set grow at similar rates, so they can be treated as essentially the same in the analysis of algorithms, since we assume the algorithms will take extremely large inputs.

For example, $x^2$ and $2x^2$ increase at just about the same rate, so for huge $x$ we can discard the constant and say both functions are $O(x^2)$.
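A small sketch of this point (my own illustration, not from the answer): the ratio $f(x)/x^2$ stays at the constant $2$ for $f(x) = 2x^2$, so $2x^2$ is $O(x^2)$; for $f(x) = x^3$ the ratio $x^3/x^2 = x$ grows without bound, so $x^3$ is not $O(x^2)$:

```python
# Ratios against the base function x^2 for increasingly large x:
# 2x^2 / x^2 stays at the constant 2, while x^3 / x^2 = x keeps growing.
for x in (10, 1_000, 100_000):
    print(x, (2 * x**2) / x**2, x**3 / x**2)
```

A bounded ratio is exactly what membership in the set $O(x^2)$ means.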