According to this paper, the local spectral moments $\mu_k(i)$ are defined as the $i$-th diagonal entry of the $k$-th power of the adjacency matrix $A$ of a simple graph $G = (V, E)$, i.e.,
$$ \mu_k(i) := \left( A^k \right)_{ii}.$$
Later in the paper, the authors state that
$$ \mu_k(i) \leq \lambda^k $$
for every vertex $i \in \{1, \dots, n\}$ and for every $k \in \mathbb N$, where $\lambda$ is the main eigenvalue of $A$.
I would like to know where this inequality comes from. I could imagine the Perron-Frobenius theorem playing a role.
Assuming that the graph is undirected, its adjacency matrix $\bf A$ is symmetric and, thus, has real eigenvalues and a spectral decomposition ${\bf A} = {\bf Q} {\bf \Lambda} {\bf Q}^\top$ with orthogonal $\bf Q$. Writing $q_{ij}$ for the entries of $\bf Q$, we get
$$ \mu_k (i) := {\bf e}_i^\top {\bf A}^k {\bf e}_i = {\bf e}_i^\top {\bf Q} {\bf \Lambda}^k {\bf Q}^\top {\bf e}_i = \sum_{j=1}^n \lambda_j^k \, q_{ij}^2 \leq \left( \lambda_{\max} ({\bf A}) \right)^k \underbrace{\sum_{j=1}^n q_{ij}^2}_{= \, {\bf e}_i^\top {\bf Q} {\bf Q}^\top {\bf e}_i \, = \, 1} = \left( \lambda_{\max} ({\bf A}) \right)^k,$$
where the inequality uses $\lambda_j^k \leq |\lambda_j|^k \leq \left( \lambda_{\max} ({\bf A}) \right)^k$ for every $j$ and every $k \in \mathbb N$. This last step is exactly where Perron-Frobenius enters: since $\bf A$ is entrywise nonnegative, its spectral radius is itself an eigenvalue, so $\lambda_{\max}({\bf A}) \geq 0$ and $|\lambda_j| \leq \lambda_{\max}({\bf A})$ for all $j$. (For even $k$ the bound holds without this, but for odd $k$ a symmetric matrix could in general have $|\lambda_{\min}| > \lambda_{\max}$; nonnegativity rules that out.) The final equality holds because the rows of the orthogonal matrix $\bf Q$ are unit vectors.
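As a quick sanity check, here is a small NumPy sketch (my own example graph, not from the paper) that computes the local spectral moments $\mu_k(i) = (A^k)_{ii}$ for a random undirected graph and verifies $\mu_k(i) \leq \lambda_{\max}^k$ for a range of $k$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric 0/1 adjacency matrix with zero diagonal
# (a simple undirected graph on n vertices).
n = 8
upper = np.triu(rng.integers(0, 2, size=(n, n)), 1)
A = (upper + upper.T).astype(float)

# Largest eigenvalue of the symmetric matrix A; by Perron-Frobenius
# this equals the spectral radius, since A is entrywise nonnegative.
lam_max = np.linalg.eigvalsh(A).max()

Ak = np.eye(n)
for k in range(1, 11):
    Ak = Ak @ A
    mu = np.diag(Ak)                 # local spectral moments mu_k(i)
    assert np.all(mu <= lam_max**k + 1e-9), k

print("mu_k(i) <= lambda_max^k verified for k = 1..10")
```

The small tolerance `1e-9` only guards against floating-point round-off; the inequality itself is exact.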