This is a homework problem, but it's taking me longer than expected to solve, so I'd appreciate a pointer in the right direction.
The definition of $f=O(g)$ only requires the inequality to hold from some point on, i.e. for all $n>n_0$. The exercise asks whether the constant can be chosen so that
$$f(n)<c\cdot g(n)$$
holds for *all* $n$, so I either have to find two functions for which this fails, or prove that it always holds (by induction, say).
I suspect it isn't true: since $c$ has to be a fixed constant, not something that changes with $n$, it should be enough to find two functions that disprove the statement.
EDIT: $f,g: \mathbb N \to \mathbb R ^+$
Hint
What happens if $g(0)=0$ but $f(0) > 0$ ?
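To make the hint concrete, here is a minimal numeric sketch; the functions $f$ and $g$ are hypothetical, chosen only to illustrate how $g(0)=0$ breaks the claim (this requires allowing $g$ to take the value $0$, i.e. it applies before the edit restricting the codomain to $\mathbb R^+$):

```python
# Hypothetical example: f(n) = n + 1, g(n) = n.
# Here f = O(g) (take c = 2, n0 = 1), yet at n = 0 the inequality
# f(0) < c * g(0) reads 1 < 0, which fails for every constant c.
f = lambda n: n + 1
g = lambda n: n

# The asymptotic bound holds past n0 = 1:
assert all(f(n) < 2 * g(n) for n in range(2, 1000))

# But no constant c can rescue n = 0, since c * g(0) = 0 < f(0):
assert all(not (f(0) < c * g(0)) for c in (1, 10, 10**6))
```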
Edit
If $f$ and $g$ map $\Bbb N$ to $\Bbb R^+$, then the statement is true:
As $f=O(g)$, there exist $c_1>0$ and $n_0$ such that $\forall n > n_0$, $f(n) < c_1 g(n)$.
With $$ c_2=1+\max_{n \leq n_0} \frac{f(n)}{g(n)}$$ (the maximum exists because it ranges over a finite set, and the quotients are well defined because $g>0$), you have $\forall n \leq n_0$, $f(n) < c_2 g(n)$. Note that taking the bare maximum would only give $f(n) \leq c_2 g(n)$, with equality at the maximizing $n$; adding $1$ makes the inequality strict.
Then you can verify that with $c=\max(c_1,c_2)$, you have $\forall n $, $f(n) < c g(n)$.
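The construction can be checked numerically. The sketch below uses hypothetical $f$ and $g$ (with $g>0$ on all of $\mathbb N$, as required) and a $c_2$ strictly above the finite maximum, so the inequality stays strict everywhere:

```python
# Hypothetical f = O(g) with f, g : N -> R+ (g never zero).
f = lambda n: 3 * n + 5
g = lambda n: n + 1

n0, c1 = 10, 4  # for n > n0: f(n) < c1 * g(n), since 3n+5 < 4n+4 iff n > 1
c2 = 1 + max(f(n) / g(n) for n in range(n0 + 1))  # strict bound for n <= n0
c = max(c1, c2)

# f(n) < c * g(n) now holds for every n, not just n > n0:
assert all(f(n) < c * g(n) for n in range(10_000))
```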