For any given pair of positive integers $a$ and $b$, is it possible that the first digit of $a^n$ never matches the first digit of $b^n$ for any positive integer $n$?
(If $a=2$ and $b=5$ the only possible matching first digit is "3".)
Edit: I meant to say $a=2$ in the parenthetical comment above. I had previously written $a=3$, which does not work as several comments indicate.
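The parenthetical claim for $a=2$, $b=5$ is easy to check numerically; here is a minimal sketch using exact integer arithmetic (the bound of 2000 is an arbitrary choice):

```python
# Collect every first digit at which 2^n and 5^n agree, for n up to a bound.
def first_digit(x: int) -> int:
    return int(str(x)[0])

matching = {first_digit(2**n)
            for n in range(1, 2000)
            if first_digit(2**n) == first_digit(5**n)}
print(matching)  # prints {3}; the first such n is 5 (2^5 = 32, 5^5 = 3125)
```

This is consistent with the observation that if both $2^n$ and $5^n$ start with digit $d$, then $d^2 \le 10^k < (d+1)^2$ for some power of $10$, which forces $d=3$.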
I was also thinking along the lines of Ross Millikan's comment. The first digit (in base 10) of $a^n$ is determined by the fractional part of $n\log_{10}(a)$; specifically, the interval $[0,1)$ is partitioned into the buckets $[0,\log_{10}(2)), [\log_{10}(2), \log_{10}(3))$, etc. The first digits of $a^n$ and $b^n$ are the same if the fractional parts of $n\log_{10}(a)$ and $n\log_{10}(b)$ fall into the same bucket. This would follow from the fractional part of $m\log_{10}(a/b)$ being smaller than the width of the smallest bucket (which is $[\log_{10}(9),1)$, of width $\log_{10}(10/9)$) for some $m$, in which case either $m$ or $m+1$ would do the trick.
Now $\log_{10}(a/b)$ is irrational unless $a/b$ is an integer power of 10, and in the latter case $a^n$ and $b^n$ have the same first digit for every $n$, so the result is obvious. For irrational $\alpha$, the fractional parts of the multiples $m\alpha$ are dense in the unit interval (equidistribution), and the result follows from the considerations above.
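Since the fractional parts of $m\log_{10}(a/b)$ are dense, a matching $n$ should always turn up for any pair; a brute-force search confirms this for small cases (the search bound of 10000 is my own choice, not part of the argument):

```python
def smallest_matching_n(a: int, b: int, limit: int = 10000):
    """Smallest n >= 1 for which a**n and b**n share a first digit, or None."""
    for n in range(1, limit):
        if str(a**n)[0] == str(b**n)[0]:
            return n
    return None

n = smallest_matching_n(2, 3)
print(n)  # 17: 2^17 = 131072 and 3^17 = 129140163 both start with 1
```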