So this is an assignment for my algorithms course. I've googled a lot, read the chapter in the book about big-Oh notation, and I understand the concept. I do not, however, understand how to prove it. I know I need to work with a constant and factor it out of the equation. And I know that the answer I've found for whether $f(n) = O(g(n))$ implies $g(n) = O(f(n))$ is
NO. $f(x)=x$, $g(x)=x^2$ is a counterexample.
But I do not understand this answer. If $g(x)$ could be $x^2$, then wouldn't $f(n)$ fail to be $O(g(n))$?
Can someone explain in a simple way why this is the case though? :(
The definition of big-Oh notation is as follows: $f(x) = O(g(x))$ if there exist constants $c > 0$ and $x_0$ such that $f(x) \leq c\,g(x)$ for every $x \geq x_0$.
This is why $f(x)=x$ and $g(x)=x^2$ is a counterexample:
$x=O(x^2)$ because, taking for example $c=1$ and $x_0=1$, we have $x \leq x^2$ for every $x \geq 1$.
$x^2$ can't be $O(x)$, because that would mean there is some constant $c$ with $x^2 \leq cx$ for every $x \geq x_0$, i.e. $x^2-cx \leq 0$. But this is false for every fixed $c$, because $$\lim_{x \to \infty} x^2-cx=\infty,$$ so $x^2 - cx$ eventually becomes (and stays) positive, no matter how large $c$ is.
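A quick numeric sanity check can make both directions concrete (this is an illustration, not a proof; the function names `f` and `g` and the range of values are my own choices):

```python
# Illustrate the counterexample f(x) = x, g(x) = x^2.

def f(x):
    return x

def g(x):
    return x * x

# Direction 1: f = O(g), witnessed by c = 1, x0 = 1,
# i.e. f(x) <= 1 * g(x) holds for every x >= 1.
assert all(f(x) <= 1 * g(x) for x in range(1, 10_000))

# Direction 2: g is not O(f). The ratio g(x) / f(x) = x is unbounded,
# so any fixed candidate constant c eventually fails.
c = 1_000
assert any(g(x) > c * f(x) for x in range(1, 10_000))
```

The second assertion only shows that the particular constant $c = 1000$ fails; the limit argument above is what rules out *every* constant at once.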
The usual intuition is that $f = O(g)$ means that, up to a constant factor, $f$ grows no faster than $g$.
This explains why $x=O(x^2)$ but the reverse isn't true.