Suppose $X\sim \operatorname{Bin}(n,p_n)$ (binomial distribution) with $p_n\to 0$ and $np_n\to \infty$. Then $$\limsup_{n\to\infty} \frac{\mathbb{E}\left[(X-np_n)^{2k}\right]}{(np_n)^k}<\infty, \quad \mbox{ for each }k\in \mathbb{N}.$$ For those who are not familiar with the binomial distribution: $\operatorname{Bin}(n,p)$ is the distribution of the number of successes in $n$ independent trials, each succeeding with probability $p$.
I am not entirely sure whether it is true. Using the known closed-form expressions for the central moments of the binomial distribution, one can check that the claim does hold for $k=1,2,3$.
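As a quick sanity check (not a proof), one can compute the ratio exactly from the pmf along a concrete sequence, say $p_n=n^{-1/2}$, and watch it stay bounded for $k=1,2,3$. A small sketch in Python using `scipy.stats.binom` (the choice of $p_n$ and the helper name are mine):

```python
import numpy as np
from scipy.stats import binom

def central_moment_ratio(n, p, k):
    """E[(X - np)^{2k}] / (np)^k for X ~ Bin(n, p), summed over the pmf."""
    x = np.arange(n + 1)
    pmf = binom.pmf(x, n, p)
    return np.sum(pmf * (x - n * p) ** (2 * k)) / (n * p) ** k

# p_n = n^{-1/2}: p_n -> 0 while n * p_n = sqrt(n) -> infinity
for k in (1, 2, 3):
    ratios = [central_moment_ratio(n, n ** -0.5, k) for n in (10**2, 10**3, 10**4)]
    # heuristically, the normal approximation suggests these approach 1, 3, 15
    print(k, ratios)
```

For $k=1$ the ratio is exactly $np(1-p)/(np)=1-p$, which the numerics reproduce.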
My attempt: The falling-factorial moments have a nice closed-form expression, $$\mathbb{E}[(X)_i]:=\mathbb{E}[X(X-1)\cdots(X-i+1)]=(n)_i p_n^i,$$ so I tried to expand \begin{align}\mathbb{E}\left[(X-np_n)^{2k}\right] & =\sum_{j=0}^{2k} \binom{2k}{j}\mathbb{E}[X^j](-np_n)^{2k-j} \\ & =\sum_{j=0}^{2k} \binom{2k}{j}(-np_n)^{2k-j}\sum_{i=0}^j {j \brace i}\mathbb{E}[(X)_i] \\ & =\sum_{j=0}^{2k} \binom{2k}{j}(-np_n)^{2k-j}\sum_{i=0}^j {j \brace i}(n)_ip_n^i. \end{align} But then I am not sure how to proceed or simplify the expression. Here ${j \brace i}$ denotes the Stirling numbers of the second kind.
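The expansion itself can at least be verified mechanically for small concrete values of $n$, $p$, $k$. A `sympy` sketch (the concrete numbers are my own choice):

```python
import sympy as sp
from sympy.functions.combinatorial.numbers import stirling  # kind=2 by default

n, p, k = 6, sp.Rational(1, 3), 2   # small concrete values, exact arithmetic

# Direct computation: E[(X - np)^{2k}] from the binomial pmf
direct = sum(sp.binomial(n, x) * p**x * (1 - p)**(n - x) * (x - n * p)**(2 * k)
             for x in range(n + 1))

# The triple sum from the attempt above; sp.ff(n, i) is the falling factorial (n)_i
expanded = sum(sp.binomial(2 * k, j) * (-n * p)**(2 * k - j)
               * sum(stirling(j, i) * sp.ff(n, i) * p**i for i in range(j + 1))
               for j in range(2 * k + 1))

assert sp.simplify(direct - expanded) == 0  # the two expressions agree exactly
```

This only confirms the algebra is right; it does not by itself yield the asymptotics.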
Let me know if you have any thoughts on it. Any form of help or suggestion is much appreciated.
I figured it out myself. Let me answer here in case anyone is interested.
Suppose $X=\sum_{i=1}^n Y_i$ where the $Y_i$ are independent $\operatorname{Ber}(p_n)$ random variables. Let me also write $p$ instead of $p_n$. So we may write $$\mathbb{E}[(X-np)^{2k}]=\mathbb{E}\left[\left(\sum_{i=1}^n (Y_i-p)\right)^{2k}\right].$$ Once we expand the power inside, we get terms of the form $$\mathbb{E}\left[\prod_{j=1}^r (Y_{i_j}-p)^{m_j}\right] \stackrel{ind}{=} \prod_{j=1}^r \mathbb{E}\left[(Y_{i_j}-p)^{m_j}\right] \qquad (1)$$ where $(m_1,\ldots,m_r)$ forms a partition of $2k$ and $(i_1,\ldots,i_r)$ are distinct elements of $\{1,2,\ldots,n\}$. The number of ways one can choose $(i_1,\ldots,i_r)$ from $\{1,2,\ldots,n\}$ is of order $O(n^r)$, hence there are $O(n^r)$ terms of the above form. Now, the leading order of the central moments of the Bernoulli distribution is easy to compute. Indeed, $$\mathbb{E}\left[(Y_{i_j}-p)^{m_j}\right]= \mathbb{E}[Y_{i_j}^{m_j}]+\sum_{t=1}^{m_j} \binom{m_j}{t}\mathbb{E}[Y_{i_j}^{m_j-t}](-p)^t.$$ As $p\to 0$, the leading behavior is given by the first term, which is nothing but $\mathbb{E}[Y_{i_j}^{m_j}]=p$ since $Y_{i_j}\in\{0,1\}$ (this is for $m_j\ge 2$; when $m_j=1$ the central moment is exactly $0$, which is used below). Thus the leading behavior of the r.h.s. of $(1)$ is $p^r$.
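The leading-order claim for the Bernoulli central moments can be checked symbolically. A small `sympy` sketch (the helper name is mine):

```python
import sympy as sp

p = sp.symbols('p', positive=True)

def ber_central_moment(m):
    """E[(Y - p)^m] for Y ~ Ber(p): Y = 1 w.p. p, and Y = 0 w.p. 1 - p."""
    return sp.expand(p * (1 - p)**m + (1 - p) * (-p)**m)

assert ber_central_moment(1) == 0                      # m = 1: exactly zero
for m in range(2, 8):
    # leading behavior as p -> 0 is p, i.e. the ratio to p tends to 1
    assert sp.limit(ber_central_moment(m) / p, p, 0) == 1
```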
So we have figured out the contribution of $(1)$ as well as the number of terms of the form $(1)$ in the expansion. Now the key observation is: if any of the $m_j$ is exactly $1$, then $(1)$ is exactly zero, since $\mathbb{E}[Y_{i_j}-p]=0$. So we may assume $m_j\ge 2$ for all $j$. But that means the length of the partition can be at most $k$, i.e., $r\le k$. So, for each fixed partition $(m_1,\ldots,m_r)$ with $m_j\ge 2$, the total contribution is at most $O(n^r p^r)=O((np)^r)=O((np)^k)$ (as $np\to \infty$ and $r\le k$). Now the number of such partitions depends only on $k$ (it has nothing to do with $n$ or $p$). Hence this shows $$\mathbb{E}[(X-np)^{2k}]=O((np)^k).$$
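The combinatorial step, that every partition of $2k$ with all parts $\ge 2$ has length at most $k$ and that the number of such partitions depends on $k$ alone, can also be verified with `sympy`'s partition iterator:

```python
from sympy.utilities.iterables import partitions

for k in range(1, 8):
    # partitions() yields {part: multiplicity} dicts, reusing one dict per
    # iteration, so copy before storing
    good = [prt.copy() for prt in partitions(2 * k) if min(prt) >= 2]
    lengths = [sum(prt.values()) for prt in good]  # length r = number of parts
    assert max(lengths) <= k                       # all parts >= 2 forces r <= k
    print(k, len(good))  # the count depends only on k, never on n or p
```

For instance, for $k=3$ the admissible partitions of $6$ are $6$, $4+2$, $3+3$, and $2+2+2$, with lengths $1,2,2,3\le 3$.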