I read the definition that $f$ is in $O(x^n)$ if $|f(x)|<C|x^n|$ for some $C$.
I'm struggling to understand how to check this. For example, supposedly $f(x) = 5x+3x^2$ is in $O(x^2)$ but not $O(x)$?
If I plot $f(x) = 5x + 3x^2$ and $g(x)=x$ I see that the first goes to infinity much quicker.
If I let $g(x) = Cx$ and plot $C=1, C=10, C=20, C=100$, it looks like $g$ overcomes $f$ for $C>10$.

But if I zoom out further, I can see that's not true: $f$ eventually overtakes $Cx$ again.
So it seems that no matter what $C$ I pick, the inequality eventually fails. How can I show whether there exists a $C$ that makes the definition hold, so that I can tell if $f$ is in $O(x^n)$?
If I go out far enough and $C$ is large enough, can't I make either $f$ or $g$ as close to the $y$-axis as I want?


The definition you quoted is not complete: the inequality is only supposed to hold for all $x>k$, where $k$ is some constant. The idea behind the $\mathcal O$-notation is to provide an estimate (comparison) for large $x$. This makes life much easier for your example, since $x < x^2 \ \forall\, x>1$, so that
$$|f(x)| = 5x + 3x^2 < 5x^2 + 3x^2 = 8x^2 \quad \forall\, x > 1.$$
So you might choose $C:=8$ and $k:=2$ here in order to see that $f\in\mathcal O(x^2)$. By the way, this statement can be generalized to hold for polynomials of degree $n$, i.e. if $f\in \mathcal P_n$ then $f\in\mathcal O(x^n)$.
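If it helps to see this numerically, here is a small sanity check (not a proof, and the function name `f` is just my label): it samples the witness $C=8$, $k=2$ to confirm $|f(x)| \le 8x^2$ on a range of points, and then shows why no constant $C$ works for $O(x)$, since $f(x) > Cx$ as soon as $x > (C-5)/3$.

```python
# Sanity check (not a proof) for f(x) = 5x + 3x^2.
# Claim 1: with C = 8 and k = 2, |f(x)| <= C*x^2 for all x > k.
# Claim 2: f is NOT in O(x): for any candidate C, f(x) > C*x
#          once x > (C - 5)/3, because f(x)/x = 5 + 3x is unbounded.

def f(x):
    return 5 * x + 3 * x ** 2

# Claim 1: check the witness C = 8, k = 2 on sample points x = 3..9999.
C, k = 8, 2
assert all(abs(f(x)) <= C * x ** 2 for x in range(k + 1, 10_000))

# Claim 2: for several candidate constants, find the first integer
# past the crossover point (C - 5)/3 and confirm f has overtaken Cx.
for C in (10, 100, 10_000):
    x = (C - 5) // 3 + 1
    assert f(x) > C * x
```

Running this raises no `AssertionError`: the single pair $(C,k)=(8,2)$ tames $f$ against $x^2$ everywhere sampled, while every candidate $C$ for the $O(x)$ bound fails at some finite $x$, matching what you saw when zooming out on the plots.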