I am looking for a prettier proof showing that the scaled moments of a simple random walk are bounded above by the moments of a standard normal distribution.
More precisely, let $X_1, \ldots, X_n$ be independent Rademacher-distributed random variables (i.e. taking the values $1$ and $-1$ each with probability $1/2$). Then, $$ E\Big[\big(\frac{1}{\sqrt{n}} \sum_{i=1}^n X_i \big)^{2k}\Big] \le (2k-1)!!.$$
The basic idea of my current proof relies on combinatorial arguments and goes like this: The expectation is equal to $$ n^{-k} \sum_{i_{1},\ldots,i_{2k}} E[X_{i_1} \cdots X_{i_{2k}}],$$ where I sum over all combinations with $i_1,\ldots, i_{2k} \in \{1,\ldots,n\}$. Since $E[X_1]=0$ and $E[X_1^2]=1$ and all random variables are independent, the expectation is zero whenever some random variable occurs an odd number of times. In the remaining combinations, where the random variables appear in pairs, the expectation equals one; counting the number of ways to choose such indices then yields the statement.
For example, for $k=2$, $$ n^{-2} \sum_{i_{1},\ldots,i_{4}} E[X_{i_1} \cdots X_{i_{4}}] = n^{-2} [n E[X_1^4] + 3 n(n-1)E[X_1^2X_2^2]] = 3-2 \frac{1}{n} \le 3,$$ and all other terms vanish because their expectation is zero.
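As a numerical sanity check of this counting argument, here is a small sketch (the function names `scaled_moment` and `double_factorial` are my own, not from the question) that computes $E[(S_n/\sqrt{n})^{2k}]$ exactly from the binomial distribution of $S_n$ and compares it with $(2k-1)!!$:

```python
from math import comb

def scaled_moment(n, k):
    # E[(S_n / sqrt(n))^(2k)] for S_n a sum of n Rademacher variables:
    # S_n takes the value n - 2j with probability C(n, j) / 2^n.
    return sum(comb(n, j) * (n - 2 * j) ** (2 * k)
               for j in range(n + 1)) / (2 ** n * n ** k)

def double_factorial(m):
    # m!! = m * (m - 2) * (m - 4) * ... for odd m
    result = 1
    while m > 1:
        result *= m
        m -= 2
    return result

# The bound E[(S_n / sqrt(n))^(2k)] <= (2k-1)!! holds for all tested n, k
for n in (2, 5, 10, 50):
    for k in (1, 2, 3, 4):
        assert scaled_moment(n, k) <= double_factorial(2 * k - 1) + 1e-12

print(scaled_moment(10, 2))  # 3 - 2/10 = 2.8, matching the k=2 formula
```

This confirms the value $3 - 2/n$ from the $k=2$ computation above.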
I am thankful for any suggestions.
Edit: Another approach is to consider the moment generating function and calculate its derivatives: $$E[\exp(tS_n)] = \cosh(t)^n, \qquad S_n = \sum_{i=1}^n X_i.$$ Only the even moments are of interest (since the odd moments are zero). Evaluating the derivatives of the moment generating function at $t=0$ has the advantage that all terms containing $\sinh(t)$ disappear, but it is quite unattractive/complicated to derive a general form of the $2k$-th derivative.
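To illustrate the MGF approach for $k=2$, one can differentiate $\cosh(t)^n$ symbolically; the following sketch (assuming `sympy` is available) recovers $E[S_n^4] = 3n^2 - 2n$, which after dividing by $n^2$ matches the value $3 - 2/n$ from the combinatorial computation:

```python
import sympy as sp

t, n = sp.symbols('t n', positive=True)

# MGF of S_n = X_1 + ... + X_n for independent Rademacher X_i
mgf = sp.cosh(t) ** n

# Fourth derivative at t = 0 gives E[S_n^4]; all sinh(t) terms vanish there
fourth_moment = sp.diff(mgf, t, 4).subs(t, 0)

print(sp.expand(fourth_moment))  # expands to 3*n**2 - 2*n
print(sp.simplify(fourth_moment / n**2))  # equals 3 - 2/n after scaling
```

Automating this for general $2k$ runs into exactly the difficulty mentioned above: the mixed $\cosh$/$\sinh$ terms in the higher derivatives grow rapidly in number.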