Consider the following Binomial distribution, $$ N \sim \operatorname{Binomial}(e^{nA}, e^{-nB}), $$ that is, a Binomial distribution with exponentially many trials and an exponentially small success probability, where $A > B$. What I'm struggling to show is that for $s>0$, $$ \mathbb{E}\left[N^s\right] \doteq e^{ns(A-B)}, $$ where the dot-equality notation $a_n \doteq b_n$ denotes equality on the exponential scale for two positive sequences $\left\{a_n\right\}$ and $\left\{b_n\right\}$, i.e. $\lim_{n \to \infty} \frac{1}{n} \log \frac{a_n}{b_n} = 0$.
Does anyone have an idea how to prove this? I attempted to show that $N$ concentrates around its expectation with probability tending to one, but I was unsuccessful.
For $X \sim \operatorname{Binomial}(m, p)$, Markov's inequality gives, for $s>0$,
$$\frac{\mathbb{E}\left[X^s\right]}{\lfloor mp\rfloor^s} \ge \mathbb{P}\left(X^s\ge \lfloor mp\rfloor^s\right)=\mathbb{P}(X\ge \lfloor mp\rfloor) \ge \mathbb{P}(X\ge \operatorname{med}(X)) \ge \frac{1}{2},$$
where the second-to-last step uses that the median of a Binomial satisfies $\operatorname{med}(X) \ge \lfloor mp\rfloor$, and the last step is the definition of the median. Since $\lfloor mp\rfloor^s \doteq (mp)^s$ when $mp\to\infty$, replacing $\lfloor mp\rfloor$ by $mp$ costs nothing on the exponential scale.
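As a quick numerical sanity check (not part of the proof), the lower bound can be verified by computing $\mathbb{E}[X^s]$ directly for moderate parameters; the values of $m$, $p$, $s$ below are illustrative choices with $mp = 20$ an integer, so the floor is immaterial:

```python
import math

def moment(s, m, p):
    """E[X^s] for X ~ Binomial(m, p), by summing k^s times the pmf
    (the k = 0 term drops out for s > 0); lgamma keeps it stable."""
    return sum(math.exp(math.lgamma(m + 1) - math.lgamma(k + 1)
                        - math.lgamma(m - k + 1)
                        + k * math.log(p) + (m - k) * math.log1p(-p)) * k**s
               for k in range(1, m + 1))

m, p, s = 2000, 0.01, 0.5            # illustrative values, mp = 20
ratio = moment(s, m, p) / (m * p)**s
# >= 1/2 by the Markov/median bound; <= 1 here by Jensen (x^0.5 is concave)
assert 0.5 <= ratio <= 1.0
```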
Moreover, $X$ has a sub-Poisson distribution: writing $q = 1-p$ and applying $1+x \le e^x$ to each factor,
$$M_X(t)=\left(q+pe^t\right)^m \le \exp \left(mp\left(e^t-1\right)\right).$$
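This factor-wise inequality is easy to check numerically; here is a minimal Python sketch with illustrative values of $m$, $p$:

```python
import math

m, p = 500, 0.02                                   # illustrative parameters, mp = 10
for t in (-2.0, -0.5, 0.0, 0.5, 1.0, 2.0):
    mgf = (1 - p + p * math.exp(t)) ** m           # exact Binomial MGF
    bound = math.exp(m * p * (math.exp(t) - 1))    # sub-Poisson upper bound
    assert mgf <= bound * (1 + 1e-12)              # 1 + x <= e^x, per factor
```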
This implies that for every $s>0$ (see Corollary 1 of this paper, or the "Higher moments" section of the Wikipedia link, for more details):
$$\mathbb{E}\left[X^s\right] \le (mp)^s \exp \left (\frac{s^2}{2mp} \right ).$$
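Again purely as a sanity check, one can compare $\mathbb{E}[X^s]$ against this bound numerically for a few values of $s$ (the parameters below are illustrative):

```python
import math

def moment(s, m, p):
    """E[X^s] for X ~ Binomial(m, p), by summing k^s times the pmf."""
    return sum(math.exp(math.lgamma(m + 1) - math.lgamma(k + 1)
                        - math.lgamma(m - k + 1)
                        + k * math.log(p) + (m - k) * math.log1p(-p)) * k**s
               for k in range(1, m + 1))

m, p = 2000, 0.01                                 # illustrative, mp = 20
for s in (0.5, 1.0, 2.0, 3.0):
    bound = (m * p)**s * math.exp(s * s / (2 * m * p))
    assert moment(s, m, p) <= bound               # E[X^s] <= (mp)^s exp(s^2/(2mp))
```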
Hence, we get the two-sided bound
$$\log \frac{1}{2} \le \log \frac{\mathbb{E}\left[X^s\right]}{(mp)^s} \le \frac{s^2}{2mp},$$
which yields the desired result after setting $p=e^{-nB}$ and $m= e^{nA}$ with $A>B>0$: then $mp = e^{n(A-B)} \to \infty$, so after dividing by $n$ both the constant $\log\frac{1}{2}$ and the error term $\frac{s^2}{2mp}$ vanish as $n \to \infty$, giving $\frac{1}{n}\log \mathbb{E}\left[N^s\right] \to s(A-B)$.
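To close with a numerical illustration (hedged: exact computation is only feasible for small $n$, and $A=2$, $B=1$, $s=2$ are arbitrary choices satisfying $A>B>0$), the normalized gap $\left|\frac{1}{n} \log \mathbb{E}[N^s] - s(A-B)\right|$ visibly shrinks as $n$ grows:

```python
import math

def log_binom_pmf(k, m, p):
    """log of the Binomial(m, p) pmf at k, via lgamma for stability."""
    return (math.lgamma(m + 1) - math.lgamma(k + 1) - math.lgamma(m - k + 1)
            + k * math.log(p) + (m - k) * math.log1p(-p))

def log_moment(s, m, p):
    """log E[X^s] by log-sum-exp over a window around the mean mp
    (the truncated tail beyond 40 standard deviations is negligible)."""
    mean = m * p
    hi = min(m, int(mean + 40 * math.sqrt(mean)) + 10)
    terms = [log_binom_pmf(k, m, p) + s * math.log(k) for k in range(1, hi + 1)]
    top = max(terms)
    return top + math.log(sum(math.exp(t - top) for t in terms))

A, B, s = 2.0, 1.0, 2.0                           # any A > B > 0 and s > 0 would do
gaps = []
for n in (3, 4, 5):
    m, p = round(math.exp(n * A)), math.exp(-n * B)
    gaps.append(abs(log_moment(s, m, p) - n * s * (A - B)) / n)
print(gaps)   # |(1/n) log E[N^s] - s(A-B)|, decreasing in n
```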