I know that a decimal number with $d$ digits requires at most $\lceil d\log_210 \rceil$ bits to be written in binary representation.
I'm trying to prove the general claim:
For every $b,c>1$: a number with $d$ digits in base $b$ requires at most $\lceil d\log_cb \rceil$ digits to be written in base $c$ representation.
I tried to use the fact that a number $N$ with $d$ digits in any base $b$ satisfies $b^{d-1} \leq N \leq b^d$, and to apply logarithm rules, but got stuck.
Any help appreciated.
What you need is that the number of base $c$ digits of $N$ is $\lceil \log_c (N+1) \rceil$, which comes from the same consideration as $b^{d-1} \leq N \lt b^d$ (note that the upper inequality is strict, unlike in your attempt). Since $N \lt b^d$ implies $N+1 \le b^d$, a number $N$ with $d$ digits in base $b$ has at most $\lceil \log_c (N+1) \rceil \le \lceil \log_c b^d \rceil = \lceil d\log_c b \rceil$ digits when written in base $c$.
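If it helps to see the bound in action, here is a small Python sketch (names like `digits` are my own, not from the question) that counts digits by repeated division and checks the inequality $\text{digits}_c(N) \le \lceil d\log_c b \rceil$ over a range of numbers and bases:

```python
import math

def digits(n, base):
    """Count the digits of n written in the given base."""
    count = 0
    while n > 0:
        n //= base
        count += 1
    return max(count, 1)  # 0 still occupies one digit

# Check the claimed bound: a number with d digits in base b
# needs at most ceil(d * log_c(b)) digits in base c.
for b in (2, 10, 16):
    for c in (2, 3, 10):
        for n in range(1, 10000):
            d = digits(n, b)
            bound = math.ceil(d * math.log(b, c))
            assert digits(n, c) <= bound, (n, b, c)
print("bound holds for all tested cases")
```

Note that floating-point `math.log` can round values like $\log_2 8$ slightly away from the exact integer, so an exactness-sensitive check would need rational arithmetic; for this upper-bound inequality on small inputs it is harmless.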