I am taking a course on complexity theory, but I am having trouble proving a Big-Oh equation like this one:
$O(2^{2n}) = O(2^n)$
$O(g(n))$ is a set of functions that satisfy the following definition: the function $f(n)$ is an element of $O(g(n))$ if there are positive constants $c$ and $n_0$ such that $f(n) \leq c \cdot g(n)$ for all $n \geq n_0$.
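To make sure I understand the definition, here is a numeric sanity check with an example of my own (not from the course): $f(n) = 3n + 5$ is in $O(n)$, witnessed by $c = 4$ and $n_0 = 5$, since $3n + 5 \leq 4n$ exactly when $n \geq 5$.

```python
# Sanity check of the Big-Oh definition for a concrete example:
# f(n) = 3n + 5 is in O(n), witnessed by c = 4 and n_0 = 5,
# because 3n + 5 <= 4n holds exactly when n >= 5.
def f(n):
    return 3 * n + 5

c, n0 = 4, 5
assert all(f(n) <= c * n for n in range(n0, 10_000))
assert not f(4) <= c * 4  # the choice of n_0 matters: the bound fails at n = 4
print("f(n) <= c*n holds for every checked n >= n_0")
```

Of course a finite check is not a proof, but it helps confirm that the constants were chosen correctly.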
I have already done some exercises of the form $f(n) = O(g(n))$, but never $O(f(n)) = O(g(n))$. Is there a different approach for proving equations like that? I read in a book that when an $O(g(n))$ term occurs in an exponent, as in $2^{O(g(n))}$, it stands for $2^{c \cdot g(n)}$. So, can I do the same for equations like the one above? My approach is this:
$2^{2n} \in O(2^n)$
$2^{2n} \leq c \cdot 2^n$
Taking $\log_2$ of both sides, I can write
$2n \cdot \log_2(2) \leq c \cdot n \cdot \log_2(2)$
And because $\log_2(2) = 1$:
$2n \leq c \cdot n$
Now I can set $c = 2$ and $n_0 = 1$, and this shows that $O(2^{2n}) = O(2^n)$. Is this correct?
The book I am reading is "Introduction to the Theory of Computation" by Michael Sipser.
$O(f(n)) = O(g(n))$ is a shortcut for "every function in the set $O(f(n))$ is also a member of the set $O(g(n))$". Now if you have proved that $f(n)$ is in $O(g(n))$, you can show quite easily that every element of the set $O(f(n))$ is also a member of the set $O(g(n))$.
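That last step can be written out in one line. Suppose $f(n) \in O(g(n))$, so $f(n) \leq c_1 \cdot g(n)$ for $n \geq n_1$, and take any $h(n) \in O(f(n))$, so $h(n) \leq c_2 \cdot f(n)$ for $n \geq n_2$. Then:

```latex
% Chaining the two Big-Oh bounds:
h(n) \le c_2 \cdot f(n) \le c_2 c_1 \cdot g(n)
\quad \text{for all } n \ge \max(n_1, n_2),
% so h(n) \in O(g(n)), witnessed by c = c_1 c_2 and n_0 = \max(n_1, n_2).
```

So membership in $O(g(n))$ follows with the constants $c = c_1 c_2$ and $n_0 = \max(n_1, n_2)$.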
You can't just take the logarithm on both sides. That might prove that $\log f(n)$ is an element of $O(\log g(n))$, but that's absolutely not the same as $f(n) = O(g(n))$.
Looking at the problem "$O(2^{2n}) = O(2^n)$", you first need to decide whether you want to prove or disprove it. To me, $2^{2n}$ looks a lot bigger than $2^n$. And indeed, no matter how large you pick $c$, you can pick an $n$ such that $2^{2n} = 2^n \cdot 2^n > c \cdot 2^n$; this is the case as soon as $2^n > c$.
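You can also see this numerically; the sketch below (with a helper name of my own choosing) finds, for any fixed candidate constant $c$, an $n$ where the bound $2^{2n} \leq c \cdot 2^n$ breaks:

```python
# For any fixed candidate constant c, the bound 2^(2n) <= c * 2^n
# fails as soon as 2^n > c, so no pair (c, n_0) can witness
# 2^(2n) being in O(2^n).
def bound_fails_at(c):
    """Smallest n with 2**(2*n) > c * 2**n, equivalently with 2**n > c."""
    n = 0
    while 2 ** (2 * n) <= c * 2 ** n:
        n += 1
    return n

for c in (1, 10, 1000, 10**9):
    n = bound_fails_at(c)
    assert 2 ** (2 * n) > c * 2 ** n
    print(f"c = {c}: bound already fails at n = {n}")
```

However large you make $c$, the failing $n$ only grows logarithmically in $c$, which matches the analysis above: the bound collapses exactly when $2^n$ overtakes $c$.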