I'm unsure how to solve this exercise; any feedback on the steps I took, or on alternative approaches, is appreciated.
Intuitively this should not be the case, because the constant factor $c$ carries no weight against the exponential term as $n$ approaches infinity.
Definition: $\mathcal{O}(g) = \{f \in \mathcal{F} \mid \exists c > 0\ \exists n_0 \in \mathbb{N}\ \forall n \ge n_0 : f(n) \le c\,g(n)\}$
So far I've rewritten the equation a bit to get "simpler" terms:
$$\begin{aligned}
a^n = c\cdot b^n &\Leftrightarrow \log_b(a^n) = \log_b(c \cdot b^n) \\
&\Leftrightarrow \log_b(a^n) = \log_b(c) + \log_b(b^n) \\
&\Leftrightarrow \log_b(a^n) = \log_b(c) + n \\
&\Leftrightarrow \frac{\log_a(a^n)}{\log_a(b)} = \log_b(c) + n \\
&\Leftrightarrow \frac{n}{\log_a(b)} = \log_b(c) + n
\end{aligned}$$
On the left-hand side, $n$ is divided by the constant $\log_a(b) > 1$ (assuming $a < b$), while on the right-hand side only the constant $\log_b(c)$ is added to $n$; that added constant pales in comparison to the linear terms as $n$ gets large, so the two sides cannot agree for all large $n$ with a fixed $c$. Therefore we can't find such a $c$, and $\mathcal{O}(a^n) \neq \mathcal{O}(b^n)$.
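As a quick numeric sanity check of the chain of equivalences above (my own addition, with the arbitrary choice $a = 2$, $b = 3$): for each $n$ I solve $a^n = c \cdot b^n$ for $c$ and verify that $\frac{n}{\log_a(b)} = \log_b(c) + n$ holds, and also print $c$ to show that the required "constant" in fact shrinks with $n$.

```python
import math

# Arbitrary example values (not from the exercise): a = 2, b = 3, so 1 < a < b.
a, b = 2.0, 3.0

for n in [5, 10, 20]:
    c = a**n / b**n                 # the c that makes a^n = c * b^n hold exactly
    lhs = n / math.log(b, a)        # n / log_a(b)
    rhs = math.log(c, b) + n        # log_b(c) + n
    assert math.isclose(lhs, rhs)   # the derivation's equivalence checks out
    print(f"n={n:2d}  c={c:.6f}")   # c depends on n and tends to 0
```

The fact that $c$ must change with $n$ is exactly why no single constant can witness $b^n \in \mathcal{O}(a^n)$.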
When you write the equality $O(a^n)=O(b^n)$, both sides are sets, so the equality should be understood as an equality of sets: $$O(a^n) \subseteq O(b^n) \land O(b^n) \subseteq O(a^n)$$ To be specific, let's assume that $1<a<b$. We need to take any $g \in O(a^n)$ and show that $g \in O(b^n)$. This is clear, because $a<b$ implies $a^n \le b^n$ for all $n$, so the same constant works.
But the other inclusion, $O(b^n) \subseteq O(a^n)$, fails: obviously $b^n \in O(b^n)$. If we assume that $b^n \in O(a^n)$, then there exist $C>0$ and $N \in \mathbb{N}$ such that for all $n>N$, $$b^n \leqslant C a^n\Leftrightarrow \left(\frac{b}{a}\right)^n \leqslant C,$$ which is impossible, since $\left(\frac{b}{a}\right)^n\to \infty.$
So $1<a<b$ implies $O(a^n) \subsetneq O(b^n)$: the inclusion is proper, and in particular the two classes are not equal.
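To illustrate the unboundedness of $\left(\frac{b}{a}\right)^n$ concretely (a sketch with the arbitrary values $a = 2$, $b = 3$, not part of the proof): for any candidate constant $C$, a short search finds an $n$ with $(b/a)^n > C$, defeating that $C$.

```python
# For any proposed constant C, find an n that violates (b/a)^n <= C.
# a = 2, b = 3 are arbitrary example values with 1 < a < b.
a, b = 2.0, 3.0

for C in [10.0, 1e6, 1e12]:
    n = 1
    while (b / a) ** n <= C:   # search for the first violating exponent
        n += 1
    print(f"C={C:g}: (b/a)^{n} = {(b/a)**n:.3g} > C")
```

Equivalently, $n > \log_{b/a}(C)$ already suffices, which is the constructive content of $\left(\frac{b}{a}\right)^n \to \infty$.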