If $f(n)=2^{n+1}$ and $g(n)=2^n$, prove $f(n)=O(g(n))$

Using the basic definition of big-O notation prove that if $f(n)=2^{n+1}$ and $g(n)=2^n$, then $f(n)=O(g(n))$.

I came across two answers to this question on this website, but they weren't clear to me. Would you mind elaborating on how this can be proven? I am a first-year computer science student. Thank you!



$\forall n\in\mathbb{N},f(n)=2g(n)$, thus the sequence $\left(\frac{f(n)}{g(n)}\right)$ is bounded and $f(n)=\mathcal{O}(g(n))$.


As an alternative to Tuvasbien's answer, this is more in line with the definition I typically see in first year CS courses:

Recall $f \in O(g)$ if and only if (for some fixed choices of $C$ and $x_0$) we have $f(x) \leq Cg(x)$ for every $x \geq x_0$.

Notice $f = 2g$, so $f(x) \leq 3g(x)$ for every $x \geq 1$. Taking $C = 3$ and $x_0 = 1$, this is exactly the definition of $f \in O(g)$. (In fact $C = 2$ already works, since $f(x) = 2g(x)$; any $C \geq 2$ does.)


I hope this helps ^_^
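If it helps to see the inequality concretely, here is a small numeric sanity check (not a proof, just an illustration of the definition) using $C = 2$ and $n_0 = 1$. The function names `f` and `g` simply mirror the ones in the question:

```python
# Sanity check (not a proof): verify f(n) <= C * g(n) for C = 2, n0 = 1
# over a range of n, matching the big-O witnesses from the answer above.

def f(n):
    return 2 ** (n + 1)  # f(n) = 2^(n+1)

def g(n):
    return 2 ** n        # g(n) = 2^n

C, n0 = 2, 1
assert all(f(n) <= C * g(n) for n in range(n0, 100))
print(f"f(n) <= {C}*g(n) holds for n = {n0}..99")
```

Of course, checking finitely many values never proves an asymptotic bound; the proof is the algebraic identity $2^{n+1} = 2 \cdot 2^n$, which the constant $C$ captures for all $n$ at once.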