Let $X_1,\dots,X_n$ be i.i.d. Bernoulli variables with $$E(X_i)=p$$ for $i=1,\dots,n$ and $$X=\sum_{i=1}^n X_i \, .$$ Furthermore let $k \in \{0,1,\dots,n\}$ and consider the expectation value $$E_\pi=E\left( \pi ^k (1-\pi)^{n-k} \right)$$ with $\pi=X/n$. I rewrote the expectation value as $$E_\pi=\frac{1}{n^n} \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n \sum_{i_{k+1}=1}^n \cdots \sum_{i_n=1}^n E\left( X_{i_1} \cdots X_{i_k} (1-X_{i_{k+1}}) \cdots (1-X_{i_{n}}) \right)$$ and figured that it is essentially a combinatorial counting problem. First observe that if $i_a=i_b$ for (at least) some $a\in\{1,\dots,k\}$ and $b\in\{k+1,\dots,n\}$, the term does not contribute, since $$E\left( X_{i_a}(1-X_{i_a}) \right) = P(X_{i_a}=1) \cdot 1 \cdot 0 + P(X_{i_a}=0) \cdot 0 \cdot 1 =0 \,.$$ Therefore the expectation value factorizes, as long as we make sure that the index sets $I_k=\{i_1,\dots,i_k\}$ and $I_n \setminus I_k=\{i_{k+1},\dots,i_n\}$ run over disjoint sets. Furthermore (using $X_i^m=X_i$ for Bernoulli variables) $$E\left( X_{i_1} \cdots X_{i_k} \right) = p^r \, , \qquad E\left( (1-X_{i_{k+1}}) \cdots (1-X_{i_{n}}) \right) = (1-p)^s \, ,$$ where $r=|I_k|$ is the size of $I_k$, that is, the number of distinct indices, and similarly $s=|I_n \setminus I_k|$. If $S_n=\{1,\dots,n\}$ is the set of values that the indices can take, there are $\binom{n}{r}$ distinct subsets ${\cal S}_r \subseteq S_n$ of size $r$, each contributing a factor $p^r$. For each fixed subset, counting the admissible index tuples $I_k$ is the problem of putting $k$ distinct balls into $r$ distinct bins with each bin receiving at least one ball, which gives $r! \, S_{k,r}$ with the Stirling number of the second kind $S_{k,r}$. We can thus write \begin{align} E_\pi &= \frac{1}{n^n} \sum_{r=0}^k \sum_{s=0}^{n-k} \sum_{{\cal S}_r\subseteq S_n} \sum_{{\cal S}_s\subseteq S_n\setminus {\cal S}_r} \sum_{I_k \in {\cal S}_r} \sum_{I_n \setminus I_k \in {\cal S}_s} p^r (1-p)^s \\ &= \frac{1}{n^n} \sum_{r=0}^k \sum_{s=0}^{n-k} \binom{n}{r} \binom{n-r}{s} \, r! \, S_{k,r} \, s! \, S_{n-k,s} \, p^r (1-p)^s \\ &= \frac{1}{n^n} \sum_{r=0}^k \sum_{s=0}^{n-k} \frac{n!}{(n-r-s)!} \, S_{k,r} \, S_{n-k,s} \, p^r (1-p)^s \tag{1} \end{align} which also holds in the cases $k=0$ and $k=n$. For example, for $k=n$ we would have $$E_\pi = \frac{1}{n^n} \sum_{r=0}^n \frac{n!}{(n-r)!} \, S_{n,r} \, p^r \, ,$$ which becomes $1$ for $p=1$ by the generating function for the Stirling numbers of the second kind and amounts to the normalization.
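As a sanity check, formula (1) can be compared numerically against the direct expectation over the binomial distribution of $X$. A minimal sketch (the helper names `stirling2`, `E_formula1`, and `E_direct` are my own, not standard library functions):

```python
from math import comb, factorial
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(m, j):
    # Stirling numbers of the second kind S_{m,j} via the standard recurrence
    # S_{m,j} = j*S_{m-1,j} + S_{m-1,j-1}, with S_{0,0} = 1.
    if m == j:
        return 1
    if j == 0 or j > m:
        return 0
    return j * stirling2(m - 1, j) + stirling2(m - 1, j - 1)

def E_formula1(n, k, p):
    # Double sum (1) over the numbers r, s of distinct indices.
    total = 0.0
    for r in range(k + 1):
        for s in range(n - k + 1):
            # r <= k and s <= n-k guarantee r + s <= n.
            total += (factorial(n) // factorial(n - r - s)) \
                     * stirling2(k, r) * stirling2(n - k, s) \
                     * p**r * (1 - p)**s
    return total / n**n

def E_direct(n, k, p):
    # E( pi^k (1-pi)^(n-k) ) computed directly from the binomial pmf of X.
    return sum(comb(n, x) * p**x * (1 - p)**(n - x)
               * (x / n)**k * (1 - x / n)**(n - k)
               for x in range(n + 1))
```

Both routines agree to floating-point precision for small $n$, and `E_formula1(n, n, 1.0)` returns $1$, matching the normalization check above.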
My question concerns the more general form (1): are further simplifications, or even a closed form, known?
Another expression
$\pi$ takes the value $\;r/n\;$ with probability $ \displaystyle \binom{n}{r}p^r(1-p)^{n-r}$, for $r=0,\dots,n$.

The probabilities sum to $1$: $ \displaystyle \sum_{r=0}^n \binom{n}{r}p^r(1-p)^{n-r} = 1$

With $n, k, p$ constant, the expected value is therefore, by definition, $$\sum_{r=0}^n \binom{n}{r}p^r(1-p)^{n-r} (r/n)^k(1-r/n)^{n-k} $$
That is equal to $$ \dfrac{1}{n^n} \sum_{r=0}^n \binom{n}{r}p^r(1-p)^{n-r} r^k(n-r)^{n-k} $$
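This single-sum expression can also be checked numerically against a Monte Carlo estimate of the defining expectation $E\left( \pi^k (1-\pi)^{n-k} \right)$. A sketch under my own naming conventions (`E_closed`, `E_monte_carlo` are hypothetical helpers):

```python
import random
from math import comb

def E_closed(n, k, p):
    # The single-sum expression above, with the 1/n^n factor pulled out.
    return sum(comb(n, r) * p**r * (1 - p)**(n - r)
               * r**k * (n - r)**(n - k)
               for r in range(n + 1)) / n**n

def E_monte_carlo(n, k, p, trials=200_000, seed=0):
    # Simulate X as a sum of n Bernoulli(p) draws and average pi^k (1-pi)^(n-k).
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        x = sum(rng.random() < p for _ in range(n))
        pi = x / n
        acc += pi**k * (1 - pi)**(n - k)
    return acc / trials
```

For moderate `trials` the two estimates agree to within sampling error, and for $p=1$, $k=n$ the closed sum reduces exactly to $1$, consistent with the normalization noted in the question.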