The generally agreed definition of Big-O notation (as far as I know) is the following:
The function $f(n)$ is $O(g(n))$ if there exist constants $c$ and $n_0$ such that for all $n \ge n_0$, $f(n) \le c g(n)$.
Why can't we replace $n_0$ by a constant value, say, $1$? In this case, I can define $c'$ to be $\max\left(c, \max_{1 \le i \le n_0}\frac{f(i)}{g(i)} \right)$, and the inequality $f(n) \le c' g(n)$ should then hold for all $n \ge 1$, right?
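The construction can be sanity-checked numerically. Here is a small sketch; the pair $f(n) = n^2 + 100$, $g(n) = n^2$ is an illustrative assumption, not taken from the question:

```python
# Sanity check of the proposed construction: given f(n) <= c*g(n) for n >= n0,
# the enlarged constant c' = max(c, max_{1 <= i <= n0} f(i)/g(i)) should make
# f(n) <= c'*g(n) hold for ALL n >= 1 -- provided g(i) != 0 on 1..n0.

def f(n):
    # Illustrative example function (an assumption for this check).
    return n * n + 100

def g(n):
    return n * n

c, n0 = 2, 10  # f(n) <= 2*g(n) holds exactly when n^2 >= 100, i.e. n >= 10

# Enlarge the constant to absorb the finitely many small cases 1..n0.
c_prime = max(c, max(f(i) / g(i) for i in range(1, n0 + 1)))

# With n0 replaced by 1 and c by c_prime, the inequality holds everywhere tested.
assert all(f(n) <= c_prime * g(n) for n in range(1, 10_000))
print(c_prime)  # -> 101.0, dominated by f(1)/g(1)
```

This works because only finitely many values $1 \le i \le n_0$ need to be absorbed into the constant, and each ratio $f(i)/g(i)$ is finite as long as $g$ does not vanish there.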
If $g(i) = 0$ for some $i < n_0$, you run into trouble: for example, with $f(n) = n$ and $g(n) = n - 1$ we have $f(n) \le 2 g(n)$ for all $n \ge 2$, but no constant $c$ satisfies $f(1) \le c\, g(1) = 0$. Beyond that, it's simply more convenient to ignore small $n$.