Conjecture.
Let $a$ and $b \neq \pm 1$ be two integers.
Assume that the following system of congruences holds:
$$\left\{\begin{array}{l}a-1 \equiv 0\mod{b-1}\\a^2-1 \equiv 0\mod{b^2-1}\\ \ldots\\a^n-1 \equiv 0\mod{b^n-1}\end{array}\right. (\star)$$
$\bf{a.}$ Is it true that $\frac{(b-a)(b^2-a)\cdot\ldots\cdot(b^n-a)}{(b-1)(b^2-1)\cdot\ldots\cdot(b^n-1)}\in\mathbb{Z}?$ The answer is "no", as shown by the counterexample from bof's comment ($a=5$, $b=3$, $n=2$).
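For concreteness, bof's counterexample is easy to verify with a few lines of Python (`fractions.Fraction` keeps the quotient exact); this is only a sanity check, not part of the argument:

```python
from fractions import Fraction

a, b, n = 5, 3, 2  # bof's counterexample

# check that the system (star) holds: b^i - 1 divides a^i - 1 for i = 1..n
assert all((a**i - 1) % (b**i - 1) == 0 for i in range(1, n + 1))

# the quotient from part a
q = Fraction(1)
for k in range(1, n + 1):
    q *= Fraction(b**k - a, b**k - 1)

print(q)  # -1/2
```

The quotient equals $-1/2$: not an integer, but its denominator is a power of $2$, consistent with part b.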
$\bf{b.}$ Is it true that $\frac{(b-a)(b^2-a)\cdot\ldots\cdot(b^n-a)}{(b-1)(b^2-1)\cdot\ldots\cdot(b^n-1)} = \frac{u}{v}$ in lowest terms, where $v$ is a power of $2$? And what is the bound on $\log_2 v$ in terms of $n$ (is it true that $v\leq 2^n$)?
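Part b can be probed by brute force over a small box of parameters; here is a sketch (the helper names and the search ranges are my own choices, and I only scan positive $a$ and $b$):

```python
from fractions import Fraction

def star_holds(a, b, n):
    """True iff b^i - 1 divides a^i - 1 for all i = 1..n."""
    return all((a**i - 1) % (b**i - 1) == 0 for i in range(1, n + 1))

def denominator(a, b, n):
    """Denominator, in lowest terms, of prod_k (b^k - a)/(b^k - 1)."""
    q = Fraction(1)
    for k in range(1, n + 1):
        q *= Fraction(b**k - a, b**k - 1)
    return q.denominator

# exhaustive check over a small box: whenever (star) holds,
# the denominator should be a power of 2 and at most 2^n
for n in range(1, 5):
    for b in range(2, 20):
        for a in range(2, 1000):
            if star_holds(a, b, n):
                v = denominator(a, b, n)
                assert v & (v - 1) == 0 and v <= 2**n, (a, b, n, v)
print("no counterexample in the search box")
```

No counterexample turns up in this box, which agrees with the bound proved in the answer below ($v$ divides $2^{[n/2]+[n/4]+\dots}$, which is at most $2^{n-1}$).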
My ideas.
I think that, if the implication $(\star)\Rightarrow \bf{b}$ holds, the following lemma can be helpful.
Lemma. Consider $2n$ positive integers $a_1, a_2,\ldots, a_n$ and $b_1, b_2,\ldots, b_n$. Suppose that for every prime $p\not= 2$ and every positive integer $k$ we have $\#\{i:a_i\equiv 0 \mod{p^k}\}\geq \#\{j: b_j\equiv 0\mod{p^k}\}$. Then $\frac{\prod_{i=1}^n a_i}{\prod_{j=1}^n b_j}=\frac{u}{v}$, where $v$ is a power of $2$.
So I am really interested in any proof of the Conjecture, or in a counterexample to it.
Your lemma may indeed be applied.
Notations. For a prime $p$ and non-zero integer $x$ let $|x|_p$ denote the maximal non-negative integer $k$ for which $p^k|x$; and $|0|_p=+\infty$. Also denote $$f(p^t)=\#\left\{i\le n: |b^i-1|_p\geqslant t\right\},\\ g(p^t)=\#\left\{i\le n: |b^i-a|_p\geqslant t\right\}.$$ Then I claim that $\sum_t g(p^t)\geqslant \sum_t f(p^t)$ for odd $p$ and $|n!|_2+\sum_t g(2^t)\geqslant \sum_t f(2^t)$. In other words, the expression $$ 2^{[n/2]+[n/4]+[n/8]+\dots}\prod_{k=1}^n \frac{b^k-a}{b^k-1} $$ is integral.
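Before going through the proof, the integrality claim itself can be sanity-checked numerically; a small sketch (the helper names and the search box are my own):

```python
from fractions import Fraction

def v2_factorial(n):
    """|n!|_2 = [n/2] + [n/4] + [n/8] + ... (Legendre's formula)."""
    s, q = 0, n // 2
    while q:
        s += q
        q //= 2
    return s

def check(a, b, n):
    """Whenever (star) holds, 2^{|n!|_2} * prod_k (b^k - a)/(b^k - 1)
    should be an integer; vacuously True when (star) fails."""
    if not all((a**i - 1) % (b**i - 1) == 0 for i in range(1, n + 1)):
        return True
    q = Fraction(2**v2_factorial(n))
    for k in range(1, n + 1):
        q *= Fraction(b**k - a, b**k - 1)
    return q.denominator == 1

assert all(check(a, b, n) for n in range(1, 5)
           for b in range(2, 20) for a in range(2, 500))
```

For $a=5$, $b=3$, $n=2$ this gives $2^1\cdot(-1/2)=-1$, an integer, as claimed.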
Fix a prime $p$. Choose the minimal $s\in \{1,2,\dots,n\}$ for which $b^s\equiv 1 \pmod p$ (if such $s$ does not exist, there is nothing to prove). Note that $p$ does not divide $s$: otherwise $b^{s/p}\equiv (b^{s/p})^p=b^s\equiv 1\pmod p$ by Fermat's little theorem, and $s$ is not minimal. Next, by minimality the residues $1,b,b^2,\dots,b^{s-1}$ are distinct modulo $p$, hence they are the $s$ distinct roots of the polynomial $x^s-1$ over $\mathbb{F}_p$. Since $p$ divides $b^s-1$, which divides $a^s-1$, $a$ is a root of the same polynomial and we get $a\equiv b^r\pmod p$ for some $r\in \{0,1,\dots,s-1\}$. Denote $a=b^r+P$, where $p$ divides $P$.
Lemma 1. $|P|_p\geqslant |b^s-1|_p$.
Proof of Lemma 1. Assume the contrary: denote $|P|_p=d\geqslant 1$ and suppose that $b^s-1$ is divisible by $p^{d+1}$. Then $a^s-1=(b^r+P)^s-1\equiv sb^{r(s-1)} P \pmod {p^{d+1}}$, since both $b^{rs}-1$ and $P^2$ are divisible by $p^{d+1}$. But neither $b$ nor $s$ is divisible by $p$, so we conclude that $|a^s-1|_p=d<|b^s-1|_p$, contradicting $(b^s-1)\mid(a^s-1)$.
We also need the following lifting lemma.
Lemma 2. Assume that $|b^S-1|_p=\beta\geqslant 1$, where $1\le S\le n$, and $a\equiv b^R \pmod{p^\beta}$ for certain $R$, where $0\le R<S$. Then we may find $\tau\in \{0,1,\dots,p-1\}$ such that $a\equiv b^{R+S\tau} \pmod {p^{\beta+1}}$.
Proof of Lemma 2. We have $b^{R+S\tau}=b^R(1+(b^S-1))^\tau\equiv b^R+\tau b^R(b^S-1)\pmod{p^{\beta+1}}$, and $\tau b^R(b^S-1)$ modulo $p^{\beta+1}$ takes each residue divisible by $p^\beta$ exactly once as $\tau$ ranges over $\{0,1,\dots,p-1\}$. In particular, it takes the residue $a-b^R$, as needed.
Now denote $|b^s-1|_p=\alpha\geqslant 1$. Note that if $p$ divides $b^x-1$, then $s\mid x$ and $|b^x-1|_p\geqslant \alpha$. In the next lifting lemma (sometimes referred to as Hensel's lemma) there is a difference between odd and even $p$.
Lemma 3. If $p$ is odd and $|b^S-1|_p>0$ for some $S>0$, then $|b^{kS}-1|_p=|b^S-1|_p+|k|_p$. The same holds if $p=2$ and $|b^S-1|_p>1$.
Proof of Lemma 3. It suffices to consider the case of prime $k$; the general case goes by induction on $k$. Denote $d=|b^S-1|_p$. We have $b^{kS}-1=(1+(b^S-1))^k-1=k(b^S-1)+\binom{k}2(b^S-1)^2+\dots$. If $k$ is not divisible by $p$, we have $|k(b^S-1)|_p=d$ and the other terms are divisible by $p^{d+1}$, thus $|b^{kS}-1|_p=d$. If $k=p$, we get $|k(b^S-1)|_p=d+1$. If either $p$ is odd, or $p=2$ and $d\geqslant 2$, all other terms are divisible by $p^{d+2}$ and we conclude that $|b^{kS}-1|_p=d+1$.
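Lemma 3 is the standard lifting-the-exponent statement and is easy to spot-check numerically; a small sketch (the particular values $p=5$, $b=7$, $S=4$ and $b=9$ are my own choices):

```python
def vp(x, p):
    """|x|_p: the exponent of p in x (x must be nonzero)."""
    k = 0
    while x % p == 0:
        x //= p
        k += 1
    return k

# odd p: if p | b^S - 1 then |b^(kS) - 1|_p = |b^S - 1|_p + |k|_p
p, b, S = 5, 7, 4           # 7^4 - 1 = 2400 is divisible by 5^2
assert vp(b**S - 1, p) == 2
for k in range(1, 60):
    assert vp(b**(k * S) - 1, p) == vp(b**S - 1, p) + vp(k, p)

# p = 2 needs |b^S - 1|_2 > 1; b = 7, S = 1 fails (|6|_2 = 1), so take b = 9
b = 9                       # |9 - 1|_2 = 3 > 1
for k in range(1, 60):
    assert vp(b**k - 1, 2) == vp(b - 1, 2) + vp(k, 2)
```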
Lemma 3 implies that, if either $p$ is odd or $\alpha>1$, for given $t\geqslant 0$ the minimal $x$ for which $p^{\alpha+t}$ divides $b^x-1$ equals $S=sp^{t}$, and $f(p^{\alpha+t})=[n/S].$ Also $f(p^u)=[n/s]$ for all $u\leqslant \alpha$.
Applying Lemma 2 consecutively, we find $R<S$ such that $a\equiv b^R\pmod{p^{\alpha+t}}$. Then at least $[n/S]$ indices $i\le n$ satisfy $a\equiv b^i\pmod {p^{\alpha+t}}$: namely $i=R,R+S,\dots,R+([n/S]-1)S$ if $R\geqslant 1$, and $i=S,2S,\dots,[n/S]S$ if $R=0$. This proves the odd case.
It remains to understand what happens for $p=2$ (and odd $b$; for even $b$ every $b^i-1$ is odd and $f(2^t)=0$). If $b-1$ is divisible by 4, the very same argument works due to Lemma 3. So it remains to consider the case $b=4k+3$. Denote $\alpha=|b+1|_2\geqslant 2$, so that $|b^2-1|_2=\alpha+1$. Since $b^2-1$ divides $a^2-1$, we get $|a-1|_2+|a+1|_2\geqslant \alpha+1$; as $a$ is odd, one of $|a-1|_2$, $|a+1|_2$ equals $1$, therefore $a$ is congruent to $\pm 1$ modulo $2^{\alpha}$. For $2\leqslant t\leqslant \alpha$ the powers of $b$ modulo $2^t$ are $-1,1,-1,1,\dots$, thus $g(2^t)\geqslant [n/2]=f(2^t)$. Also note that $g(2)=f(2)=n$. As for $t\geqslant \alpha+1$, it may happen that $g(2^t)=0$ (say, this happens for $a=5$, $b=3$), but by Lemma 3 applied with $S=2$ we get $\sum_{t\geqslant \alpha+1} f(2^t)=[n/2]+[n/4]+\dots=|n!|_2$, which is exactly the allowed budget. This finishes the even case.
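The even-case bookkeeping uses only the first two congruences of $(\star)$, so it can be tested for $a=5$, $b=3$ at every $n$, not just $n\le 2$; here is a sketch (helper names mine; the cap `tmax` is a guard against the degenerate case $b^i=a$, where $|0|_2=+\infty$):

```python
def vp(x, p):
    """|x|_p: the exponent of p in x (x must be nonzero)."""
    k = 0
    while x % p == 0:
        x //= p
        k += 1
    return k

def check_budget(a, b, n, tmax=64):
    """Verify |n!|_2 + sum_t g(2^t) >= sum_t f(2^t), using
    sum_t f(2^t) = sum_i |b^i - 1|_2 and likewise for g."""
    F = sum(vp(b**i - 1, 2) for i in range(1, n + 1))
    G = sum(tmax if b**i == a else min(vp(b**i - a, 2), tmax)
            for i in range(1, n + 1))
    legendre = sum(n // 2**j for j in range(1, n.bit_length() + 1))  # |n!|_2
    return legendre + G >= F

for n in range(1, 13):
    assert check_budget(5, 3, n)
```

For $a=5$, $b=3$ the inequality is in fact tight for small even $n$ (e.g. $n=2$: $1+3\geqslant 4$), so the factor $2^{|n!|_2}$ cannot be improved in general.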