The formula for the number of digits required to express a number $N$ in base $b$ is the following:
$\lceil{\log_{b}(N+1)}\rceil$
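As a sanity check, here is a minimal Python sketch of this formula (the function names are my own, not from any particular library; an exact integer version is included as well, since floating-point `log` can misround near exact powers of the base):

```python
import math

def num_digits(n: int, base: int) -> int:
    """Digits needed to write n in the given base: ceil(log_b(n + 1))."""
    if n == 0:
        return 1  # zero still occupies one digit
    return math.ceil(math.log(n + 1, base))

def num_digits_exact(n: int, base: int) -> int:
    """Same count via repeated division, immune to floating-point rounding."""
    digits = 0
    while n > 0:
        n //= base
        digits += 1
    return digits or 1  # n = 0 still needs one digit

print(num_digits(100, 2), num_digits_exact(100, 2))  # 7 7 (100 = 1100100 in binary)
```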
If we want to see the difference in the number of digits required for two different bases $b_1$ and $b_2$, then the general formula would be
$\lceil{\log_{b_{1}}(N+1)}\rceil - \lceil{\log_{b_{2}}(N+1)}\rceil$
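For a concrete instance (the numbers here are just an illustration), take $N = 1000$ with $b_1 = 2$ and $b_2 = 10$:

$\lceil{\log_{2}(1001)}\rceil - \lceil{\log_{10}(1001)}\rceil = 10 - 4 = 6$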
But I'm not sure how to simplify this expression in terms of $N$, since the two logarithms have different bases.
This is for a Computer Science algorithms class, so approximate answers are fine.
To convert from one logarithmic base to another, we have the change-of-base identity
$\log_b(N) = \log_a(N)/\log_a(b)$
Thus the size of the integer in one base always differs from its size in another by the constant factor $\log_a(b)$, which does not depend on $N$ (so every base gives a $\Theta(\log N)$-digit representation).
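To see that constant factor numerically, here is a quick Python check (the choice of bases 2 and 10 is mine); the ratio of digit counts drifts toward $\log_2(10) \approx 3.32$ as $N$ grows:

```python
import math

# Ratio of base-2 digits to base-10 digits approaches log_2(10) ≈ 3.3219
for exp in (3, 6, 12, 24, 48):
    n = 10 ** exp
    bits = n.bit_length()  # base-2 digit count; equals ceil(log_2(n + 1)) for n >= 1
    dec = len(str(n))      # base-10 digit count
    print(f"N = 10^{exp}: {bits} bits / {dec} decimal digits = {bits / dec:.3f}")

print("log_2(10) =", math.log2(10))
```

Running this prints ratios of 2.5, 2.857, 3.077, 3.2, 3.265, ..., approaching $\log_2(10)$, which is why all bases are considered equivalent up to a constant factor in asymptotic analysis.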