I just started an algorithm design and analysis class and I'm trying to learn how to properly prove things using big O. I was never taught much about big O in my previous CS classes, other than the fact that the dominant term is the point of focus. I'm trying to learn the proper way to prove statements like this, but everything I read confuses me even more. I know that:
$2^{n+1} = 2 \times 2^n = O(2^n)$
But I don't think this alone is enough to count as a proof. From other examples I've seen, it seems I need to use a constant $c$ to finish the proof. Is this the correct thought process? Are there other ways to prove something like this? What is the most efficient method of doing so?
If someone could help me understand the process, I would be so grateful. I've seen different ways of proving similar problems, but I'm not sure which ways are correct.
Thanks so much in advance.
OK, so start from the definition of big O.
What does it say? Well, $f(n)$ is said to be $O(g(n))$ if there exist constants $c > 0$ and $k$ such that $f(n) \leq c \times g(n)$ for all $n \geq k$.
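For reference, here is the same definition written symbolically (this is the standard formulation; some textbooks additionally require $k \geq 0$):

$$f(n) = O(g(n)) \iff \exists\, c > 0 \ \exists\, k \ \forall\, n \geq k : \ f(n) \leq c \times g(n).$$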
In words: $f(n)$ is $O(g(n))$ if, eventually, a constant multiple of $g$ is greater than or equal to $f$.
So why is $2^{n+1}$ in $O(2^n)$? Is it true that, eventually, some constant multiple of $2^n$ is greater than or equal to $2^{n+1}$? Well, if we multiply $2^n$ by $2$, we get exactly $2^{n+1}$, which is certainly greater than or equal to $2^{n+1}$!
So with $c = 2$ and $k = 1$, we have $2 \times 2^n \geq 2^{n+1}$ for all $n \geq 1$. Therefore, $2^{n+1}$ is $O(2^n)$.
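Written out in full, in the shape a proof like this usually takes (one common style; your course may prefer slightly different wording), the whole argument is only a couple of lines:

**Claim:** $2^{n+1} = O(2^n)$. **Proof:** Take $c = 2$ and $k = 1$. Then for every $n \geq k$,

$$2^{n+1} = 2 \times 2^n \leq c \times 2^n,$$

so the definition is satisfied, and $2^{n+1} = O(2^n)$. $\blacksquare$ (Note that equality is fine here: the definition only asks for $f(n) \leq c \times g(n)$.)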
If anything is still unclear, feel free to ask.