How to prove that a function f(x) is O(g(x)), using the definition (finding C and k)


We say that $f(x)$ is $O(g(x))$ if $$(∃C ∈ \mathbb{R})(∃k ∈ \mathbb{R})(∀x ∈ \mathbb{R})$$ $$(x ≥ k → |f(x)| ≤ C · |g(x)|)$$ In English: We can find $C$ and $k$ so that, once we get past the “small values” of $x$, $f(x)$ is at most a constant times $g(x)$. Usually, $f(x)$ will be complicated, and $g(x)$ will be simple.
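To make the question concrete, here is my own attempt at the kind of worked example I mean (this is not from the notes; I just want to know if this is the right pattern). Suppose we want to show that $x^2 + 2x + 1$ is $O(x^2)$. If $x ≥ 1$, then $x ≤ x^2$ and $1 ≤ x^2$, so $$|x^2 + 2x + 1| ≤ x^2 + 2x^2 + x^2 = 4x^2 = 4|x^2|.$$ So it seems like choosing $k = 1$ and $C = 4$ satisfies the definition. Is this the general strategy: bound each term of $f(x)$ by a multiple of $g(x)$ for $x$ past some cutoff $k$, then add up the constants to get $C$?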

These are all the notes we've been given in class about proofs using big-O notation, and our professor has basically parroted the same thing back to us when it's been taught. Almost everything I can find is too complicated or involved to apply to what we're using it for, but apparently it's a very important concept that I'm failing to grasp. Can someone explain this in a simple fashion that makes sense to me? That sounds like a lot to ask, but I'm pretty desperate. I've never gotten a proof correct in this class, and it drives me insane. So I need to know how to prove that one function is big-O of another, using the definition, by finding $C$ and $k$.