Using the basic definition of big-O notation prove that if $f(n)=2^{n+1}$ and $g(n)=2^n$, then $f(n)=O(g(n))$.
I came across two answers to this question on this website, but they weren't clear to me. Would you mind elaborating on how this can be proven? I am a first-year computer science student. Thank you!
By the basic definition, $f(n)=\mathcal{O}(g(n))$ means there exist constants $c>0$ and $n_0$ such that $f(n)\le c\,g(n)$ for all $n\ge n_0$. Here, for every $n\in\mathbb{N}$,
$$f(n)=2^{n+1}=2\cdot 2^n=2g(n),$$
so choosing $c=2$ and $n_0=0$ gives $f(n)\le c\,g(n)$ for all $n\ge n_0$. Equivalently, the ratio $\frac{f(n)}{g(n)}=2$ is bounded, hence $f(n)=\mathcal{O}(g(n))$.
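If it helps to see the bound concretely, here is a small sketch (the function names `f` and `g` are just labels for the two sequences in the question) that checks $f(n)\le c\cdot g(n)$ with the witness constant $c=2$ over a range of values. This is only a sanity check, not a substitute for the proof:

```python
# f(n) = 2^(n+1) and g(n) = 2^n from the question.
def f(n):
    return 2 ** (n + 1)

def g(n):
    return 2 ** n

c = 2  # the witness constant from the proof
# The inequality f(n) <= c * g(n) should hold for every n >= n0 = 0;
# here we spot-check n = 0 .. 99.
assert all(f(n) <= c * g(n) for n in range(100))
print("f(n) <= 2 * g(n) holds for n = 0..99")
```

Of course, no finite check proves the claim for all $n$; the algebraic identity $2^{n+1}=2\cdot 2^n$ is what makes the bound hold everywhere.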