For fixed $a,b,c \in \mathbb{R}$ with $ac \neq 0$, it seems to me that one can find an increasing sequence of integers $\{\alpha_n\}$ such that the quantity $c \log \alpha_n$ becomes arbitrarily close to elements of the set
$$ A = \{ ak+b \,\colon k \in \mathbb{Z} \}. $$
First, is this true? If so, my question is: how good is the approximation?
For example, given any $\epsilon > 0$, is it possible to find infinitely-many integers $n$ such that, for some constant $C$,
$$ \textrm{dist}(c \log n,A) := \inf_{k \in \mathbb{Z}} |c \log n - ak-b| \leq C n^{-\epsilon}? $$
How about
$$ \textrm{dist}(c \log n,A) \leq C e^{-n}? $$
I am also interested in results which might say something like "There are at most finitely-many $n$ satisfying
$$ \textrm{dist}(c \log n,A) \leq C e^{-n^2} $$
for any positive constant $C$" to illustrate the "best-possible" nature of a less restrictive bound.
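For experimenting with such bounds, note that the distance can be computed by reducing $c \log n - b$ modulo $a$. Here is a minimal numerical sketch; the parameter values $a = 2\pi$, $b = 0$, $c = 1$ are chosen purely for illustration:

```python
import math

def dist_to_A(n, a, b, c):
    """Distance from c*log(n) to the arithmetic progression {a*k + b : k in Z}."""
    r = (c * math.log(n) - b) % abs(a)   # position within one period of length |a|
    return min(r, abs(a) - r)            # distance to the nearer endpoint

# Illustrative parameters (not from the question): a = 2*pi, b = 0, c = 1,
# so we are asking how close log(n) gets to a multiple of 2*pi.
a, b, c = 2 * math.pi, 0.0, 1.0
best = min(range(2, 10**5), key=lambda n: dist_to_A(n, a, b, c))
print(best, dist_to_A(best, a, b, c))
```

In this range the minimiser sits near $e^{2\pi} \approx 535.49$, as one would expect.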
Motivation
I am trying to determine the behavior of a quantity like
$$ \left|\left(1-e^{i c \log n}\right)g(n)\right|^{1/n}, $$
where $g(n)$ is well-behaved. Until now I have simply been excluding all $n$ for which $1-e^{i c \log n}$ lies in some small fixed neighborhood of the origin, but doing this I lose infinitely-many $n$.
If, for example, it turns out that, for some positive constant $C$, there are only finitely-many $n$ satisfying
$$ \left| 1-e^{i(\theta + c \log n)} \right| \leq C n^{-1-\epsilon} \tag{1} $$
for any $\epsilon > 0$, then, for each fixed $\epsilon$, all but finitely-many $n$ satisfy
$$ C n^{-1-\epsilon} \leq \left| 1-e^{i(\theta + c \log n)} \right| \leq 2. $$
In that case we could exclude at most finitely-many $n$ to obtain the desirable property
$$ \left| 1-e^{i(\theta + c \log n)} \right|^{1/n} \to 1 $$
as $n \to \infty$.
Now, if we let
$$ B = \{2\pi k - \theta \,\colon k \in \mathbb{Z}\}, $$
then equation $(1)$ is equivalent to the existence of a positive constant $C_1$ such that only finitely-many $n$ satisfy
$$ \textrm{dist}(c \log n,B) \leq C_1 n^{-1-\epsilon} $$
for any $\epsilon > 0$.
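To spell out the equivalence: since
$$ \left|1-e^{ix}\right| = 2\left|\sin\tfrac{x}{2}\right| \qquad\text{and}\qquad \tfrac{2}{\pi}t \leq \sin t \leq t \ \text{ for } t \in [0,\tfrac{\pi}{2}], $$
the quantity $\left|1-e^{i(\theta+c\log n)}\right|$ is bounded above and below by constant multiples of $\textrm{dist}(\theta + c\log n, 2\pi\mathbb{Z}) = \textrm{dist}(c\log n, B)$.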
Here's a more formal proof of the main argument presented in the comment.
First, to make things a little easier on ourselves, note that $|ak+b-c\ln n|=|c|\cdot|a'k+b'-\ln n|$ where $a'=a/c$ and $b'=b/c$, so we can take $c=1$ without loss of generality. Secondly, we can assume $a>0$ and that $b$ is the smallest positive element of $A$, so that we are only concerned with the numbers $ak+b\in A$ for non-negative $k$.
To find $\ln n\approx ak+b$, take $m_k=\lfloor e^{ak+b}\rfloor$. Then $$\ln m_k \le ak+b < \ln(m_k+1)=\ln m_k+\ln\left(1+\frac{1}{m_k}\right) < \ln m_k+\frac{1}{m_k}, $$ so since $ak+b$ lies in an interval of length less than $1/m_k$, its distance $\epsilon_k$ to the nearer endpoint, i.e. to either $\ln m_k$ or $\ln(m_k+1)$, satisfies $\epsilon_k<\frac{1}{2m_k}$. If we let $n_k$ be whichever of $m_k$ or $m_k+1$ gives the better estimate, we get $\epsilon_k=|ak+b-\ln n_k|<\frac{1}{2(n_k-1)}$.
Thus, for any $C>1/2$, we'll have $\epsilon_k<C/n_k$ for all sufficiently large $k$, providing an infinite number of solutions.
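This construction is straightforward to check numerically. A small sketch; the values $a=0.7$, $b=0.3$ are illustrative, and any $a,b>0$ behave the same way:

```python
import math

def approximants(a, b, k_max):
    """For each k, pick n_k in {m_k, m_k+1} with m_k = floor(e^(a*k+b)),
    minimising eps_k = |a*k + b - ln(n_k)|."""
    out = []
    for k in range(1, k_max + 1):
        t = a * k + b
        m = math.floor(math.exp(t))
        # choose whichever of m, m+1 has logarithm closest to t
        n, eps = min(((n, abs(t - math.log(n))) for n in (m, m + 1)),
                     key=lambda p: p[1])
        out.append((k, n, eps))
    return out

# Every approximant satisfies the bound eps_k < 1/(2*(n_k - 1)) from the argument.
for k, n, eps in approximants(a=0.7, b=0.3, k_max=40):
    assert eps < 1 / (2 * (n - 1))
```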
The second claim I made I'll have to give a little more thought on how to formalise, although I can give the main idea more clearly. In any case, I think it was wrong as stated: I'd expect an infinite number of solutions with $\epsilon_k<C/(n_k k)$, which, since $k\approx(\ln n_k)/a$, translates to $\epsilon<C/(n\ln n)$ (for a different $C$). I'd mixed up $k$ and $n$ when I was thinking this through.
For generic $a$ and $b$, we would expect $ak+b$ to lie at a seemingly random place inside the interval $[\ln m_k,\ln(m_k+1))$. If we let $l_k=\ln(1+1/m_k)/2$ be half the length of the interval, we should then expect the distance $\epsilon_k$ to the closest endpoint to be uniformly distributed on $[0,l_k]$. Stated differently, we would expect the ratios $\epsilon_k/l_k$ to be uniformly distributed on $[0,1]$.
If we take a sequence $p_k\in(0,1]$, the likelihood that $\epsilon_k/l_k<p_k$ is $p_k$, so the expected number of $k$ for which $\epsilon_k<l_kp_k$ is $\sum_k p_k$. If we let $p_k=1/k$, then $l_kp_k\approx C/(n_k\ln n_k)$, and the expected number of solutions with $\epsilon_k<C/(n_k\ln n_k)$ is thus infinite, but only by a small margin: $C/[n(\ln n)^{1+\epsilon}]$ should be expected to give only finitely many solutions.
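The uniformity heuristic itself can be probed empirically. The following is a rough Monte Carlo illustration (not a proof), with $a,b$ drawn at random from illustrative ranges:

```python
import math, random

random.seed(0)

def ratio(a, b, k):
    """eps_k / l_k: relative position of a*k+b within [ln m_k, ln(m_k+1))."""
    t = a * k + b
    m = math.floor(math.exp(t))
    l = math.log1p(1 / m) / 2                         # half the interval length
    eps = min(t - math.log(m), math.log(m + 1) - t)   # distance to nearer endpoint
    return eps / l

ratios = []
for _ in range(200):                       # many random parameter pairs
    a, b = random.uniform(0.5, 1.5), random.uniform(0.0, 1.0)
    ratios.extend(ratio(a, b, k) for k in range(1, 20))

# If eps_k/l_k is roughly uniform on [0, 1], about 25% of the pooled
# ratios should fall below 0.25.
frac = sum(r < 0.25 for r in ratios) / len(ratios)
print(frac)
```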
To get a better understanding of the randomness perspective, let's rewrite the inequality $\ln m_k\le ak+b<\ln(m_k+1)$ as $$ m_k\le\beta\alpha^k<m_k+1, \text{ where }\alpha=e^a>1,\ \beta=e^b\ge1, $$ and note that $\delta_k=\textrm{dist}(\beta\alpha^k,\mathbb{Z})\approx\epsilon_k m_k$.
A case where the randomness argument fails is when $\alpha$ is an odd natural number and $\beta=3/2$: then $\beta\alpha^k=3\alpha^k/2$ is always half of an odd integer, so $\delta_k=1/2$ for all $k$. I suspect numerous similar examples can be made for algebraic $\alpha$.
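This counterexample can be verified in exact rational arithmetic; a quick sketch:

```python
from fractions import Fraction

def delta(alpha, beta, k):
    """Distance from beta * alpha^k to the nearest integer, computed exactly."""
    x = beta * Fraction(alpha) ** k
    frac = x - (x.numerator // x.denominator)   # fractional part of x
    return min(frac, 1 - frac)

# With alpha an odd integer and beta = 3/2, beta*alpha^k = (3*alpha^k)/2 is
# half of an odd integer, so delta_k is exactly 1/2 for every k.
for alpha in (3, 5, 7):
    for k in range(10):
        assert delta(alpha, Fraction(3, 2), k) == Fraction(1, 2)
```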
I would hypothesise that the set of $(\alpha,\beta)$ for which the $\epsilon_k$ (or $\delta_k$) fail to become arbitrarily small has measure zero.