The bound $2^{O(n)}$ seems to come up a lot, and I'm trying to get a feel for it.
If $f(n) = O(2^n)$ then there exist $n_0$ and $c>0$ such that $f(n) \le c2^n$ for all $n \ge n_0$.
If $g(n) = 2^{O(n)}$ then there exist $m_0$ and $d>0$ such that $\lg g(n) \le dn$ for all $n \ge m_0$.
So $g(n) \le 2^{dn}$ for all $n \ge m_0$.
If this is correct, how can we compare $f$ and $g$ above? That is, which bound is "worse", $O(2^n)$ or $2^{O(n)}$? Is there any sort of intuition I should have about these bounds?
If you are trying to bound the growth rate of $f(n)$, the bound $f(n) = 2^{O(n)}$ is significantly worse than the bound $f(n) = O(2^n)$.

$O(2^n)$ is actually a set: the set of functions $f(n)$ such that there are a constant $M$ and an integer $n_0$ with $f(n) < M \cdot 2^n$ for all $n > n_0$. Meanwhile, $2^{O(n)}$ is the set of functions $f(n)$ such that there exist $M$, $n_0$, and an $O(n)$ function $g(n)$ with $f(n) < M \cdot 2^{g(n)}$ for all $n > n_0$.

The latter condition is weaker, since $g(n)$ could be $mn$ for any fixed $m$, which gives the bound $f(n) < M \cdot 2^{mn} = M \cdot (2^m)^n$. So in the latter case, $f(n)$ could be $3^n$ or $100^n$, which grow asymptotically much faster than $2^n$.
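To make that concrete, here is the two-line check that $3^n$ lies in $2^{O(n)}$ but not in $O(2^n)$ (using the constant $m = 2$ as one valid choice of the linear exponent):

```latex
% 3^n is in 2^{O(n)}: rewrite base 3 as a power of 2
3^n = 2^{n \log_2 3} \le 2^{2n}, \qquad \text{so } 3^n = 2^{O(n)}.

% 3^n is not O(2^n): the ratio to any constant multiple of 2^n is unbounded
\frac{3^n}{C \cdot 2^n} = \frac{1}{C}\left(\frac{3}{2}\right)^{n}
\xrightarrow{\,n \to \infty\,} \infty
\quad \text{for every constant } C > 0.
```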
It is true that $f(n) = 2^{O(n)}$ could mean that $f(n) = 2^{n/2}$, but $f(n)$ could also be $2^{n/2}$ if we know the sharper bound $f(n) = O(2^{n})$. Similarly, some functions that are $O(n^2)$ grow asymptotically faster than some functions that are $O(n^3)$ (e.g., $n^2$ is $O(n^2)$ and $n$ is $O(n^3)$). But we still say that $O(n^3)$ is a worse bound than $O(n^2)$.
The language here is probably confusing, since we write $f(n) = O(g(n))$, by an accepted abuse of notation, to mean that $f(n) \in O(g(n))$. Viewed as sets, it's clear that $2^{O(n)}$ is a strictly larger set than $O(2^n)$: any $f(n) \le M \cdot 2^n$ also satisfies $f(n) \le 2^{n + \lg M}$ with $n + \lg M = O(n)$, while a function like $3^n$ belongs to $2^{O(n)}$ only.
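As a quick numerical illustration (a sanity check, not a proof), one can tabulate the two ratios from the $3^n$ example: $3^n / 2^n = (3/2)^n$ blows up, while $3^n / 2^{2n} = (3/4)^n$ stays bounded, matching the claim that $3^n$ is $2^{O(n)}$ but not $O(2^n)$.

```python
def ratio_to_2n(n: int) -> float:
    """3^n / 2^n = (3/2)^n, which grows without bound."""
    return (3 / 2) ** n


def ratio_to_2_2n(n: int) -> float:
    """3^n / 2^(2n) = (3/4)^n, which stays bounded (it tends to 0)."""
    return (3 / 4) ** n


# The first column of ratios keeps growing; the second shrinks toward 0.
for n in (10, 20, 40):
    print(f"n={n:2d}  3^n/2^n = {ratio_to_2n(n):>14.2f}  "
          f"3^n/2^(2n) = {ratio_to_2_2n(n):.6f}")
```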