We have to prove that if $\lim_{n\rightarrow\infty} \frac{f(n)}{g(n)} = 0$, then $f(n)$ is $O(g(n))$ but $g(n)$ is not $O(f(n))$.
I understand that because the limit is 0, it can be said that $f(n) \ll g(n)$ (asymptotically smaller), but how would I go about proving this properly?
Big O notation describes a set of functions: saying a function is in $O(g(n))$ means it is eventually bounded above by some constant multiple of $g(n)$.
So $O(g(n))$ is the set of functions that grow no faster than $a \cdot g(n)$ for sufficiently large $n$, where $a$ can be any fixed constant, in particular a large one.
So for instance, $f(n) = 99{,}998n^3 + 1000n$ is $O(n^3)$, because I can take the $n^3$ part and pick a big enough constant $a$ so that $a \cdot n^3$ bounds the function.
One function you could pick is $z(n) = 99{,}999n^3$: for all sufficiently large $n$ (here $n \ge 32$, i.e. once $n^2 \ge 1000$), $z(n) \ge f(n)$, and big O only requires the bound to hold eventually. Of course, you could also pick $z(n) = n^{999}$, which bounds $f(n)$ as well, so $f(n)$ is technically $O(n^{999})$ too. That is still a valid big O bound, just nowhere near the tightest, lowest-order one you can find, which is why we report $O(n^3)$.
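A quick numeric sanity check (not a proof) of the claim above, with `f` and `z` defined as in this example; note the bound only kicks in once $n$ is large enough:

```python
# f(n) = 99998*n^3 + 1000*n and the candidate bound z(n) = 99999*n^3.
# z(n) - f(n) = n^3 - 1000n, which is >= 0 exactly when n^2 >= 1000,
# so the inequality f(n) <= z(n) holds for every n >= 32 -- and big O
# only requires it to hold eventually.

def f(n):
    return 99998 * n**3 + 1000 * n

def z(n):
    return 99999 * n**3

print(f(10) <= z(10))   # False: the bound fails for small n
print(all(f(n) <= z(n) for n in range(32, 10_000)))  # True from n = 32 on
```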
One way to prove that one function grows more slowly than another is to divide one by the other and take the limit of the ratio. If the ratio tends to infinity, the numerator grows faster than the denominator. If it tends to 0, the denominator eventually bounds the numerator from above.
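To illustrate the limit test numerically with a hypothetical pair chosen for this sketch, $f(n) = n^2 + 5n$ and $g(n) = n^3$: the ratio shrinks toward 0, suggesting $f$ is $O(g)$ but not the other way around:

```python
# Hypothetical example pair for the limit test: f(n) = n^2 + 5n, g(n) = n^3.
# Sampling the ratio f(n)/g(n) at n = 10, 100, ..., 100000 shows it
# decaying toward 0 (roughly a factor of 10 per step here).

def f(n):
    return n**2 + 5 * n

def g(n):
    return n**3

ratios = [f(10**k) / g(10**k) for k in range(1, 6)]
print(ratios)  # strictly decreasing, heading toward 0
```

A decaying sample like this is only evidence, of course; the actual proof goes through the definition of the limit.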
Applied here: if, for your given functions and some constant $a > 0$, you can show
$$\lim_{n\rightarrow\infty}\frac{g(n)}{a\cdot f(n)} = 0,$$
then $g(n)$ eventually falls below $a\cdot f(n)$, and you have proved that $g(n)$ belongs to $O(f(n))$. (When the limit is 0 the constant is actually superfluous: $g(n)/f(n)\rightarrow 0$ already implies $g(n)/(a\cdot f(n))\rightarrow 0$ for every $a>0$.)
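For the statement in the question itself (assuming $g(n) > 0$ for all large $n$), the $\varepsilon$-$N$ definition of the limit gives both directions; a sketch:

```latex
% Sketch: from lim f/g = 0, derive f in O(g) and g not in O(f).
Since $\lim_{n\rightarrow\infty} f(n)/g(n) = 0$, taking $\varepsilon = 1$
in the definition of the limit gives an $N$ with
\[
  \frac{f(n)}{g(n)} < 1 \quad\text{for all } n > N,
  \qquad\text{i.e.}\qquad f(n) < 1\cdot g(n),
\]
hence $f(n) \in O(g(n))$ with constant $c = 1$.

Conversely, suppose $g(n) \in O(f(n))$: then there exist $c > 0$ and $N'$
with $g(n) \le c\, f(n)$ for all $n > N'$. But then
\[
  \frac{f(n)}{g(n)} \ge \frac{1}{c} > 0 \quad\text{for all } n > N',
\]
which contradicts $\lim_{n\rightarrow\infty} f(n)/g(n) = 0$.
So $g(n) \notin O(f(n))$.
```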