Let $G\in \mathcal{G}(n,p)$ be an Erdős–Rényi random graph, and let $X_k$ be the random variable counting the number of cycles of length $k$ in $G$. How does one obtain the following inequality:
$$ E[X_k]\le n^k p^k? $$
Hence, if $X$ denotes the number of cycles in $G$ of length at most $k$, its expectation satisfies $$ E[X]=\sum_{i=1}^k E[X_i]\le \sum_{i=1}^k n^i p^i. $$
The definition of the expectation is $$ E[X]=\sum_{G\in \mathcal{G}(n,p)} X(G)\,P(\{G\}). $$ My first thought was that for a graph $G$ containing such a cycle, $P(\{G\})=p^k$, since a cycle of length $k$ has $k$ edges.
I tried to give a proof as follows.
For every $k$-cycle $C$ with vertices in $V=\{0,\dots, n-1\}$, let $X_C$ be the indicator random variable of $C$, i.e. $X_C=1$ if $C\subseteq G$ and $X_C=0$ otherwise. Then $$ E[X_C]=P(C\subseteq G)=p^k. $$
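A quick numerical sanity check of this step (my own sketch, not part of the original question): a fixed $k$-cycle appears in $G$ exactly when all $k$ of its edges are present, and each edge is present independently with probability $p$, so $P(C\subseteq G)=p^k$. A Monte Carlo estimate of the indicator's mean should be close to $p^k$:

```python
# Sketch (assumption: an illustrative check, not from the question):
# simulate the k independent edges of a fixed k-cycle and estimate
# P(all k edges present), which should be close to p**k.
import random

def indicator_mean(p, k, trials=200_000, seed=0):
    """Monte Carlo estimate of E[X_C] = P(C is a subgraph of G)."""
    rng = random.Random(seed)
    # Each trial: the cycle is present iff every one of its k edges is.
    hits = sum(all(rng.random() < p for _ in range(k)) for _ in range(trials))
    return hits / trials

p, k = 0.5, 3
print(indicator_mean(p, k), p**k)  # estimate should be close to 0.125
```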
But why is the number of such indicator variables at most $n^k$?
A cycle of length $k$ is specified by an ordered list $(v_1,v_2,\dots,v_k)$ of vertices. There are at most $n$ choices for each of the $k$ entries in the list, so at most $n^k$ such lists. In fact each $k$-cycle corresponds to $2k$ lists (one for each choice of starting vertex and direction), so $n^k$ overcounts the cycles; but an overcount is exactly what an upper bound needs, and by linearity of expectation $$ E[X_k]=\sum_{C} E[X_C]\le n^k\,p^k. $$
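The counting argument above can be checked by brute force for small $n,k$ (my own sketch, not part of the original answer): enumerating all ordered lists of $k$ distinct vertices and identifying lists that differ only by rotation or reflection should give exactly $\frac{n!}{(n-k)!\,2k}$ distinct $k$-cycles, which is indeed at most $n^k$.

```python
# Sketch (assumption: an illustrative brute-force check): count distinct
# k-cycles on n labelled vertices and compare with n!/((n-k)! * 2k) and n^k.
from itertools import permutations
from math import factorial

def count_k_cycles(n, k):
    """Count distinct cycles of length k on vertex set {0, ..., n-1}.

    A cycle is an ordered list (v_1, ..., v_k) of distinct vertices,
    taken up to rotation and reflection; each cycle corresponds to
    exactly 2k ordered lists.
    """
    cycles = set()
    for perm in permutations(range(n), k):
        # Canonical form: rotate so the smallest vertex comes first,
        # then take the lexicographically smaller of the two directions.
        i = perm.index(min(perm))
        rotated = perm[i:] + perm[:i]
        reflected = (rotated[0],) + tuple(reversed(rotated[1:]))
        cycles.add(min(rotated, reflected))
    return len(cycles)

n, k = 6, 4
brute = count_k_cycles(n, k)
formula = factorial(n) // (factorial(n - k) * 2 * k)  # n!/((n-k)! * 2k)
print(brute, formula, n**k)  # brute == formula, and both are <= n^k
```

The `2k`-fold overcount in the permutation enumeration is precisely why the bound $n^k$ holds with room to spare.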