An exercise solution claims that for f(n) = Sum(1:n), g(n) = n^2 it holds that f isIn O(g) and g isIn O(f).
I don't understand why this is the case, as it seems to me that f isIn O(n) and g isIn O(1): f contains n loop iterations, while g contains a single multiplication regardless of the value of n. Could someone please explain this to me?
(This answer assumes that Sum(1:n) means $1 + 2 + \cdots + n$.)
When we say that $f(n) = O(g(n))$, we mean that $f(n)$ as a function is asymptotically at most a constant multiple of $g(n)$. Intuitively, it means that $g(n)$ grows at least as fast as $f(n)$. Formally, it means something like $\limsup_{n \to \infty} \frac{f(n)}{g(n)} < \infty$.
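For the specific functions in the question, both directions can be checked directly from the closed form for the sum:
$$f(n) = \sum_{k=1}^{n} k = \frac{n(n+1)}{2}.$$
For all $n \ge 1$ we have $f(n) = \frac{n(n+1)}{2} \le n^2 = g(n)$, so $f(n) = O(g(n))$; conversely, $g(n) = n^2 \le n(n+1) = 2 f(n)$, so $g(n) = O(f(n))$. (Indeed, $\lim_{n \to \infty} f(n)/g(n) = 1/2$, a finite nonzero constant.)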
This does not say anything about the time it takes to compute $f$ and $g$. If we want to say that one algorithm $A$ takes less time than another algorithm $B$, we would first let $f(n)$ be the running time of $A$ and $g(n)$ be the running time of $B$, and only then would we say that $f(n) = O(g(n))$. We would NOT say that $A = O(B)$.
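To make the function-versus-algorithm distinction concrete, here is a short Python sketch (the names `sum_loop` and `sum_formula` are mine, chosen for illustration) showing two different algorithms that compute the same mathematical function $f$:

```python
def sum_loop(n):
    """Algorithm A: computes f(n) = 1 + 2 + ... + n using n loop iterations."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    """Algorithm B: computes the same f(n) with a single multiplication."""
    return n * (n + 1) // 2

# Different running times, same function: the *value* f(n) grows like
# n^2 / 2, which is why f = O(g) and g = O(f) for g(n) = n^2.
assert all(sum_loop(n) == sum_formula(n) for n in range(100))
```

Algorithm A runs in $\Theta(n)$ time and algorithm B in $\Theta(1)$ time (counting each arithmetic operation as constant-time), yet they compute exactly the same function, and it is that function, not either algorithm, that the exercise's big-oh claims are about.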
TL;DR: a function is not the same as an algorithm that computes it. Big-oh notation compares functions; those functions may represent running times, but here $f$ and $g$ are the functions being computed, not the running times of the algorithms that compute them.