Let $X \sim \text{Binomial}(n, p)$ be the sum of $n$ Bernoulli($p$) random variables.
What is the value of $E(X/(np))^k$, where $k$ is a large integer, as $n$ grows large?
From direct calculation, the first few values are
$$\begin{aligned}
E[(X/(np))^1] &= 1 \\
E[(X/(np))^2] &= 1 + \tfrac{r}{n} \\
E[(X/(np))^3] &= 1 + \tfrac{3r}{n} + \tfrac{(1-p)(1-2p)}{(np)^2} \\
E[(X/(np))^4] &= 1 + \tfrac{6r}{n} + O(1/n^2) \\
E[(X/(np))^5] &= 1 + \tfrac{10r}{n} + O(1/n^2) \\
E[(X/(np))^6] &= 1 + \tfrac{15r}{n} + O(1/n^2) \\
E[(X/(np))^7] &= 1 + \tfrac{21r}{n} + O(1/n^2) \\
&\;\;\vdots
\end{aligned}$$
where $r = (1-p)/p$.
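These expansions are easy to check numerically. The following sketch (the helper names `binom_pmf` and `scaled_moment` are my own) computes $E[(X/(np))^k]$ exactly by summing over the pmf and compares the coefficient of $r/n$ with the triangular numbers $\binom{k}{2}$ appearing above:

```python
import math

def binom_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p), computed via logs to avoid overflow."""
    log_pmf = (math.lgamma(n + 1) - math.lgamma(x + 1) - math.lgamma(n - x + 1)
               + x * math.log(p) + (n - x) * math.log(1 - p))
    return math.exp(log_pmf)

def scaled_moment(n, p, k):
    """Exact E[(X/(np))^k], by summing over the pmf."""
    mu = n * p
    return sum(binom_pmf(n, p, x) * (x / mu) ** k for x in range(n + 1))

n, p = 10_000, 0.3
r = (1 - p) / p
for k in range(2, 8):
    # n * (E[(X/(np))^k] - 1) / r should approach C(k, 2) as n grows
    coeff = n * (scaled_moment(n, p, k) - 1) / r
    print(k, math.comb(k, 2), round(coeff, 3))
```

For $n = 10^4$ the recovered coefficients agree with $\binom{k}{2}$ up to the $O(1/n)$ correction.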
The coefficient of $r/n$ appears to be $\binom{k}{2}$, so my guess would be that one could obtain a result like $E(X/(np))^k = 1 + O(k^2/(np))$, but I'm not sure how I'd proceed.
For sums of $\{+1, -1\}$ random variables, I'm aware that the moments can be bounded by replacing each summand with a normal random variable of the same variance. However, converting this result to the non-central case doesn't seem obvious.
The right universal bound is $$E\left[\left(\frac{X}{np}\right)^k\right] \le \left(1+\frac{k}{2np}\right)^k.$$ For $k = O(\sqrt{np})$ this explains the $1+O(k^2/(np))$ behaviour.
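Before going through the proof, here is a quick numerical spot-check of this bound, with exact moments computed from the pmf (the helper names are mine, and the grid of $(n, p, k)$ values is an arbitrary choice):

```python
import math

def binom_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p), via logs to avoid overflow."""
    log_pmf = (math.lgamma(n + 1) - math.lgamma(x + 1) - math.lgamma(n - x + 1)
               + x * math.log(p) + (n - x) * math.log(1 - p))
    return math.exp(log_pmf)

def scaled_moment(n, p, k):
    """Exact E[(X/(np))^k], by summing over the pmf."""
    mu = n * p
    return sum(binom_pmf(n, p, x) * (x / mu) ** k for x in range(n + 1))

# The bound is claimed for every n, p, k; spot-check a grid.
for n in [30, 100, 300]:
    for p in [0.1, 0.5, 0.9]:
        for k in [1, 2, 5, 10, 20]:
            bound = (1 + k / (2 * n * p)) ** k
            assert scaled_moment(n, p, k) <= bound * (1 + 1e-9), (n, p, k)
print("bound holds on the whole grid")
```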
I wrote a note detailing the answer here https://thomasahle.com/#paper-bi-moments .
Using the Moment Generating Function
Let $$m(t) = E[e^{t X}] = (1-p+pe^t)^n \le \exp(\mu(e^t-1)),$$ where $\mu=np$ and the inequality uses $1+x \le e^x$. The right-hand side is the moment generating function of a Poisson random variable with mean $\mu$, so the bounds below hold in that case as well.
Now $E[X^k] \le m(t)t^{-k}k!$ follows easily from the expansion $E[e^{tX}]=\sum_i E[X^i]t^i/i!$, since every term is nonnegative for $X \ge 0$. However, we will need the slightly stronger bound $$E[X^k] \le m(t)(k/(et))^k.$$ This follows from the basic inequality $1+x\le e^x$: substituting $tx/k-1$ for $x$ gives $tx/k \le e^{tx/k-1}$, and raising both sides to the $k$th power gives $x^k \le e^{tx}(k/(et))^k$. Taking expectations yields the intended bound.
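Both the pointwise inequality and its expectation version are easy to test numerically; a minimal sketch (variable names and parameter values are mine):

```python
import math
import random

random.seed(0)

# Pointwise: x^k <= e^{t x} * (k/(e t))^k for x, t > 0, from 1 + u <= e^u.
for _ in range(1000):
    x = random.uniform(0.0, 50.0)
    t = random.uniform(0.05, 2.0)
    k = random.randint(1, 20)
    assert x ** k <= math.exp(t * x) * (k / (math.e * t)) ** k * (1 + 1e-9)

# In expectation: E[X^k] <= m(t) * (k/(e t))^k for X ~ Binomial(n, p).
n, p, k, t = 100, 0.3, 5, 0.5
m_t = (1 - p + p * math.exp(t)) ** n
moment = sum(math.comb(n, x) * p ** x * (1 - p) ** (n - x) * x ** k
             for x in range(n + 1))
assert moment <= m_t * (k / (math.e * t)) ** k
print("both inequalities verified")
```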
Define $B=k/\mu$, and choose $t$ such that $t e^t=B$; that is, $t = W(B)$, where $W$ is the Lambert function.
We then have $$\begin{align} E[(X/\mu)^k] &\le m(t)(\mu t)^{-k}(k/e)^k \\&\le \exp(\mu(e^t-1))\big(\frac{k}{e\mu t}\big)^k \\&= \exp(\mu(B/t-1)+tk-k) \\&= \exp(k f(B)) , \end{align}$$ where $f(B) = 1/t+t-1-1/B$ and we used $e^t = B/t$. Standard bounds on the Lambert function give $$\exp(f(B)) \le \frac{B}{\log(B+1)} \le 1+\frac{B}{2} \le e^{B/2}.$$
This finally gives the bound $$ \|X/\mu\|_k \le \frac{B}{\log(B+1)} \le e^{B/2}. $$ Raising to the $k$th power, $$ E[(X/\mu)^k] \le e^{kB/2} = \exp(k^2/(2np)), $$ where $e^{k^2/(2np)} = 1+O(k^2/(np))$ when $k = O(\sqrt{np})$.
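The chain of inequalities through the Lambert function can also be checked numerically. Below, `lambert_w` is a small Newton iteration of my own (not a library function), solving $te^t = B$ on the principal branch:

```python
import math

def lambert_w(b, iters=60):
    """Solve t * e^t = b for t >= 0 (principal branch) by Newton's method."""
    t = math.log1p(b)  # starting point above the root for b >= 0
    for _ in range(iters):
        e_t = math.exp(t)
        t -= (t * e_t - b) / (e_t * (t + 1))
    return t

# Check exp(f(B)) <= B/log(B+1) <= 1 + B/2 <= exp(B/2) on a grid of B values.
for b in [0.05, 0.1, 0.5, 1.0, 2.0, 5.0, 20.0]:
    t = lambert_w(b)
    f = 1 / t + t - 1 - 1 / b
    assert math.exp(f) <= b / math.log(b + 1) + 1e-9
    assert b / math.log(b + 1) <= 1 + b / 2 + 1e-9
    assert 1 + b / 2 <= math.exp(b / 2)
print("chain of bounds verified")
```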
Old attempt using norms:
I found the following interesting non-asymptotic bound:
Let $\|X\|_k = E[|X|^k]^{1/k}$ and use the triangle inequality $$\|X\|_k \le np + \|X-np\|_k.$$
To bound the second term, we use Latała's formula for sums of i.i.d. random variables: $$\|X - np\|_k = \Big\|\sum_i (X_i - E[X_i])\Big\|_k \lesssim \sup_{\max\{2,\frac kn\}\le s\le k} \frac ks \left(\frac nk\right)^{1/s} \|X_i - E[X_i]\|_s.$$
For a Bernoulli($p$) random variable: $\|X_i-p\|_k = \left(p (1-p)^k+(1-p) p^k\right)^{1/k} \le p^{1/k}$ for $k \ge 2$.
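This elementary bound is easy to confirm over a grid of $p$ and $k$ (the grid is an arbitrary choice of mine):

```python
# Check ||X_i - p||_k = (p(1-p)^k + (1-p)p^k)^(1/k) <= p^(1/k) for k >= 2.
for p in [0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99]:
    for k in range(2, 50):
        kth_moment = p * (1 - p) ** k + (1 - p) * p ** k
        assert kth_moment ** (1 / k) <= p ** (1 / k) + 1e-12, (p, k)
print("Bernoulli norm bound verified")
```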
Let $\beta = \frac{k}{np}$, and assume $k\le n$ and $np\ge 1$. Working out Latała's bound, we get
$$ \|X - np\|_k \lesssim \begin{cases} \sqrt{n p k} & \text{if $\beta < e$,} \\ \frac{k}{\log\beta} & \text{if $\beta\ge e$.} \end{cases} $$
Putting it all together we have
$$ E\left[\left(\frac{X}{np}\right)^k\right] \le \begin{cases} (1 + C\sqrt{\frac{k}{np}})^k & \text{if $\beta < e$} \\ (1 + C\frac{k}{n p\log\beta})^k & \text{if $\beta\ge e$}. \end{cases} $$
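Since Latała's formula only holds up to an unspecified constant, the best one can do numerically is check that $\|X-np\|_k/\sqrt{npk}$ stays below some small constant in the $\beta < e$ regime. The constant $2$ below is my own empirical choice, not part of the theorem, and the helper names are mine:

```python
import math

def binom_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p), via logs to avoid overflow."""
    log_pmf = (math.lgamma(n + 1) - math.lgamma(x + 1) - math.lgamma(n - x + 1)
               + x * math.log(p) + (n - x) * math.log(1 - p))
    return math.exp(log_pmf)

def centered_norm(n, p, k):
    """Exact ||X - np||_k for X ~ Binomial(n, p)."""
    mu = n * p
    return sum(binom_pmf(n, p, x) * abs(x - mu) ** k
               for x in range(n + 1)) ** (1 / k)

n = 200
for p in [0.1, 0.3, 0.5]:
    for k in range(2, 41):
        if k < math.e * n * p:  # the beta < e regime
            assert centered_norm(n, p, k) <= 2 * math.sqrt(n * p * k), (p, k)
print("sqrt(npk) growth confirmed empirically")
```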
In particular, for $k = O((n p)^{1/3})$ we get $E(\frac{X}{np})^k=1 + O(\sqrt{k^3/(n p)})$. This is not quite as good as the conjectured bound, but at least it is rigorous, and it works even when $p$ is allowed to depend on $n$.
It also shows that the moments of $\frac{X}{np}-1$ are within a constant factor of those of a Gaussian with the same variance, as long as $k < np$.
Likely one has to avoid the triangle inequality to show the stronger conjecture.