By the definition of Big $\mathcal O$ we have:
Let $f$ and $g$ be functions from $\mathbb{N}$ or $\mathbb{R}$ to $\mathbb{R}$. We say that $f(x)$ is in $\mathcal O(g(x))$ iff there are constants $C$ and $k$ such that
$$f(x) \leq Cg(x) , \,\, \forall x > k$$
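As a concrete check of this definition (an example of my own choosing): take $f(x) = 3x^2 + 5x$ and $g(x) = x^2$. For $x > 5$ we have $5x < x^2$, so
$$3x^2 + 5x \leq 3x^2 + x^2 = 4x^2 = 4g(x), \,\, \forall x > 5,$$
hence the constants $C = 4$ and $k = 5$ witness that $f(x)$ is in $\mathcal O(x^2)$.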
Attempt:
By assuming that $g(x) > 0, \,\, \forall x > k$ (so that the direction of the inequality is preserved), we can divide both sides by $g(x)$ and obtain:
$$\frac{f(x)}{g(x)} \leq C, \,\, \forall x > k$$
From here we can take $\lim_{x\to \infty}$ on both sides (assuming the limit on the left exists; note that the quantifier $\forall x > k$ no longer applies once the limit is taken), and we have:
$$\lim_{x\to \infty} \frac{f(x)}{g(x)} \leq \lim_{x\to \infty} C,$$
which is the same as:
$$\lim_{x\to \infty} \frac{f(x)}{g(x)} \leq C.$$
From here, if the limit exists, then we can conclude that $f(x)$ is $O(g(x))$, because we have found the constant $C$ and can easily find a constant $k$.
If the limit does not exist (in particular, if the ratio tends to $+\infty$), then $f(x)$ grows faster than $g(x)$, and we are unable to find a constant $C$ such that the inequality holds $\forall x > k$.
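As an illustrative numerical sketch of this limit test (not a proof, and with example functions of my own choosing): if $f$ is $O(g)$, the ratio $f(x)/g(x)$ should stay bounded as $x$ grows.

```python
def f(x):
    return 3 * x**2 + 5 * x  # example candidate f (my choice)

def g(x):
    return x**2              # example candidate g (my choice)

# Watch the ratio f(x)/g(x) for increasingly large x.
# If f is O(g), the ratio should stay bounded (here it approaches 3).
for x in [10, 100, 1000, 10**6]:
    print(x, f(x) / g(x))
```

Running this shows the ratio settling near $3$, so any $C > 3$ (say $C = 4$) works for sufficiently large $x$, matching the limit argument above.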
My question is: is this method alright, or am I making a mistake somewhere?
Your argument looks valid. I'm not sure if you are aware that this result is known; see Wikipedia and ProofWiki, and for a fully rigorous formal proof (e.g. in Mizar) look here, here and here.