These are all from Sipser's book, second edition. I was just hoping someone could verify/explain those that are more difficult for me.
$2n = O(n)$: true
$n^2 = O(n)$: false
$n^2 = O(n\log^2 n)$: I say false, because $\log^2 n$ grows more slowly than $n$
$n\log n = O(n^2)$: true because $\log n$ grows more slowly than $n$
$3^n = 2^{O(n)}$: I wasn't sure what this notation meant. Any clarification here?
$2^{2^n} = O(2^{2^n})$: I would say true, but I'm not entirely sure why.
$n = o(2n)$: false, because $o(2n) = o(n)$ and $n$ grows as quickly as itself.
$2n = o(n^2)$: true, because for every constant $c > 0$, $2n < cn^2$ once $n$ is large enough.
$2^n = o(3^n)$: true
$1 = o(n)$: I'm not really sure.
$n = o(\log n)$: false, $n$ is always larger.
$1 = o(1/n)$: true, because 1 is always larger.
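As a rough numeric sanity check (not a proof), I tried comparing the ratios $f(n)/g(n)$ for growing $n$ on a few of these pairs: if $f = O(g)$ the ratio should stay bounded, and if $f = o(g)$ it should tend to $0$.

```python
# Rough numeric sanity check (not a proof): for f = O(g) the ratio
# f(n)/g(n) stays bounded; for f = o(g) it tends to 0.
import math

pairs = [
    ("2n vs n",        lambda n: 2 * n,           lambda n: n),
    ("n log n vs n^2", lambda n: n * math.log(n), lambda n: n ** 2),
    ("2^n vs 3^n",     lambda n: 2.0 ** n,        lambda n: 3.0 ** n),
    ("1 vs n",         lambda n: 1,               lambda n: n),
]

for name, f, g in pairs:
    ratios = [f(n) / g(n) for n in (10, 50, 100)]
    print(name, ["%.3g" % r for r in ratios])
```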
I want to add to LAcarguy's answer to develop some intuition. When looking at exponentials of the form $a^{f(n)}$ vs. $b^{f(n)}$, $a^{f(n)} \in O(b^{f(n)})$ when $a \leq b$. Note that this assumes $f(n)$ is an increasing, positive function. In computational complexity, this is usually a fair assumption, but not always.
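Regarding the $2^{O(n)}$ notation asked about in the question (in case LAcarguy's answer didn't already cover it): $2^{O(n)}$ denotes $2^{g(n)}$ for some function $g(n)$ satisfying $g(n) = O(n)$. A change of base then settles that item:

```latex
3^n = \left(2^{\log_2 3}\right)^n = 2^{(\log_2 3)\,n}
```

and since $(\log_2 3)\,n = O(n)$, we have $3^n = 2^{O(n)}$: true.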
Little-o is a strict inequality, whereas Big-O is a weak inequality. Formally, $f(n) = o(g(n))$ means that for *every* positive constant $C$, we have $|f(n)| < C \cdot |g(n)|$ for all sufficiently large $n$. (Compare Big-O, which only requires this for *some* constant $C$.)
Using this definition, look at $1 = o(1/n)$: we would need $1 < \frac{C}{n}$ for all large $n$, no matter which $C > 0$ is chosen. But $\frac{C}{n} \geq 1$ holds only when $n \leq C$, which is finitely many values of $n$; eventually the $n$ in the denominator dominates any fixed constant. So $1 \neq o(1/n)$ — the relationship actually goes the other way: $\frac{1}{n} = o(1)$. The limit test that LAcarguy mentioned describes the mathematics behind this intuition.
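To make the "for every constant" part of little-o concrete, here is a small sketch (an illustration, not a proof): no matter how large a constant $C$ we pick, $C \cdot \frac{1}{n} \geq 1$ holds only for $n \leq C$, a finite range, which is why $1 \neq o(1/n)$.

```python
# For 1 = o(1/n) we would need 1 < C*(1/n) for all large n, for EVERY C > 0.
# But C/n >= 1 only while n <= C -- a finite range -- so the definition fails.
for C in (10, 1000, 10 ** 5):
    last_n_where_holds = max(n for n in range(1, 2 * C) if C / n >= 1)
    assert last_n_where_holds == C  # C/n >= 1 stops holding past n = C

print("for each C, C/n >= 1 holds only for n <= C, so 1 != o(1/n)")
```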
For your question about $2^{2^{n}} = O(2^{2^{n}})$, you are correct. Look at the definition of Big-O to see why. We need constants $C, k \in \mathbb{R}^{+}$ such that $2^{2^{n}} \leq C * 2^{2^{n}}$, for all $n \geq k$. Let $C = k = 1$. We have now satisfied the definition of Big-O.
An alternative way to look at $2^{2^{n}} = O(2^{2^{n}})$ is with Big-Theta. Big-Theta is an equivalence relation: $f(n) = \Theta(g(n))$ if and only if there are positive constants $C_1, C_2$ such that $C_1 \cdot |g(n)| \leq |f(n)| \leq C_2 \cdot |g(n)|$ for all sufficiently large $n$. Setting $C_1 = C_2 = 1$ gives the equivalence of $f(n)$ to itself; this is the reflexive property of Big-Theta as an equivalence relation. Big-Theta has a couple of other equivalent characterizations: $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $g(n) = O(f(n))$. Equivalently, $f(n) = O(g(n)) \leftrightarrow g(n) = \Omega(f(n))$. If you haven't learned about Big-Theta yet and this is confusing, feel free to ignore it; if it helps you with the intuition, by all means, use it.
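As a concrete (hypothetical) illustration of the two-constant Big-Theta definition: for $f(n) = 2n^2 + 3n$ and $g(n) = n^2$, the constants $C_1 = 2$, $C_2 = 3$ work for all $n \geq 3$, which we can spot-check numerically.

```python
# Numeric illustration (not a proof) of f = Theta(g):
# C1*g(n) <= f(n) <= C2*g(n) for all n >= k, with C1, C2 > 0.
def f(n):
    return 2 * n ** 2 + 3 * n

def g(n):
    return n ** 2

C1, C2, k = 2, 3, 3
assert all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(k, 10_000))
assert f(2) > C2 * g(2)  # the upper bound only kicks in once n >= 3

print("2n^2 + 3n = Theta(n^2) holds on the tested range")
```

The small-$n$ failure is harmless: both Big-O and Big-Theta only constrain behavior for $n$ beyond some threshold $k$.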
Hopefully this helps clarify things, though!