Assume that the graph $ G $ is equipped with node features $X \in \mathbb{R}^{n \times p_0} $, where $ x_i \in \mathbb{R}^{p_0} $ is the feature vector at node $ i = 1, \ldots, n = |V| $. Let $A$ denote the adjacency matrix of the graph, let $\tilde{A} = A + I$ be the augmented adjacency matrix with self-loops, and similarly let $D$ be the degree matrix and $\tilde{D} = D + I$ the corresponding diagonal augmented degree matrix. Define the normalized augmented adjacency matrix $\hat{A} = \tilde{D}^{-\frac12}\tilde{A}\tilde{D}^{-\frac12}$. (I am not sure why the authors use this normalized matrix.)
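For concreteness, here is how I compute $\hat{A}$ on a small example (a numpy sketch; the path graph is my own choice, not from the paper). My guess at the "why" is that the symmetric normalization makes $\hat{A}$ symmetric with real spectrum contained in $[-1, 1]$, so powers $\hat{A}^k$ stay bounded:

```python
import numpy as np

# Toy undirected 4-node path graph 0-1-2-3 (my own example, just for illustration).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_tilde = A + np.eye(4)                    # add self-loops: A~ = A + I
d = A_tilde.sum(axis=1)                    # augmented degrees (diagonal of D~)
A_hat = A_tilde / np.sqrt(np.outer(d, d))  # elementwise form of D~^{-1/2} A~ D~^{-1/2}

eigs = np.linalg.eigvalsh(A_hat)           # A_hat is symmetric, so spectrum is real
print(eigs.min(), eigs.max())              # spectrum lies in [-1, 1]; max is exactly 1
```

The largest eigenvalue is $1$ (with eigenvector $\tilde{D}^{1/2}\mathbf{1}$), so repeated multiplication by $\hat{A}$ neither explodes nor requires per-layer rescaling, which seems relevant for a bound involving $\hat{A}^{r+1}$.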
We denote by $ h^{(\ell)}_i \in \mathbb{R}^{p_\ell}$ the representation of node $i$ at layer $\ell \geq 0$, with $ h^{(0)}_i = x_i $. Given a family of message functions $ \psi^{\ell}: \mathbb{R}^{p_\ell} \times \mathbb{R}^{p_\ell} \rightarrow \mathbb{R}^{p'_{\ell}} $ and update functions $ \phi^\ell : \mathbb{R}^{p_\ell} \times \mathbb{R}^{p'_{\ell}} \to \mathbb{R}^{p_{\ell+1}} $, we can write the $(\ell + 1) $-st layer output of a generic MPNN as follows:
$$ h^{(\ell+1)}_i = \phi^\ell \left( h^{(\ell)}_i, \sum_{j=1}^{n} \hat{A}_{ij} \psi^\ell(h^{(\ell)}_i, h^{(\ell)}_j) \right) . \tag{1} $$
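To make sure I am parsing equation (1) correctly, I implemented one layer with my own toy choices of $\psi$ and $\phi$ (these specific functions are not from the paper, just simple smooth maps):

```python
import numpy as np

def mpnn_layer(H, A_hat, W, U, V):
    """One layer of the generic MPNN in eq. (1), with toy choices
    psi(h_i, h_j) = tanh(W h_j) and phi(h, m) = tanh(U h + V m)."""
    M = A_hat @ np.tanh(H @ W.T)         # row i holds sum_j A_hat[i,j] psi(h_i, h_j)
    return np.tanh(H @ U.T + M @ V.T)    # phi applied rowwise

# Toy run: 4-node path graph 0-1-2-3 with self-loops, symmetric normalization.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))          # n = 4 nodes, p_0 = 3 input features
W = rng.standard_normal((5, 3))          # message dimension p'_0 = 5
U = rng.standard_normal((2, 3))          # output dimension p_1 = 2
V = rng.standard_normal((2, 5))
H_next = mpnn_layer(X, A_hat, W, U, V)   # shape (n, p_1) = (4, 2)
```

The key point I take from the formula is that the sum over $j$ is weighted by $\hat{A}_{ij}$, so only (normalized) neighbors of $i$, including $i$ itself via the self-loop, contribute to the message.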
I am a bit confused about the following result and its proof concerning the sensitivity of the message-passing function.
Assume an MPNN as in equation (1). Let $i, s \in V$ with $s \in S_{r+1}(i) $, the set of nodes at distance exactly $r+1$ from $i$. If $ |\nabla \phi^\ell| \leq \alpha $ and $ |\nabla \psi^\ell| \leq \beta $ for $ 0 \leq \ell \leq r $, then $$ \left| \frac{\partial h^{(r+1)}_i}{\partial x_s} \right| \leq (\alpha \beta)^{r+1} \left( \hat{A}^{r+1} \right)_{is}. $$
The proof is based on the chain rule and the product rule for derivatives: $$ \frac{\partial h^{(r+1)}_i}{\partial x_s} = \partial_1 \phi^r(\cdot) \, \frac{\partial h^{(r)}_i}{\partial x_s} + \partial_2 \phi^r(\cdot) \sum_{j=1}^{n} \hat{A}_{ij} \left( \partial_1 \psi^r(h^{(r)}_i, h^{(r)}_j) \, \frac{\partial h^{(r)}_i}{\partial x_s} + \partial_2 \psi^r(h^{(r)}_i, h^{(r)}_j) \, \frac{\partial h^{(r)}_j}{\partial x_s} \right), $$ where $\partial_1$ and $\partial_2$ denote partial derivatives with respect to the first and second arguments. This chain-rule step from the paper is not clear to me, nor how it is used to obtain the bound; I would appreciate a more detailed explanation.
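To convince myself the bound is at least plausible, I ran a numerical check on a linear special case where I expect it to hold with equality: scalar features, $\psi^\ell(h_i, h_j) = \beta h_j$ and $\phi^\ell(h, m) = m$, so $\alpha = 1$ and the layers compose to $h^{(r+1)} = \beta^{r+1} \hat{A}^{r+1} x$ (this linear setup is my own, not from the paper):

```python
import numpy as np

# Path graph 0-1-2-3: node s = 3 is at distance exactly r+1 = 3 from node i = 0.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))   # D~^{-1/2} (A + I) D~^{-1/2}

beta, r = 0.7, 2                            # Lipschitz constant of psi; r+1 = 3 layers
x = np.array([1.0, -2.0, 0.5, 3.0])         # scalar node features (toy data)

def forward(x):
    h = x
    for _ in range(r + 1):                  # layers 0..r
        h = beta * (A_hat @ h)              # phi(h, m) = m, psi(h_i, h_j) = beta * h_j
    return h

# Finite-difference derivative of h_i^{(r+1)} w.r.t. x_s for i = 0, s = 3.
i, s, eps = 0, 3, 1e-6
e = np.zeros(4); e[s] = eps
num_grad = (forward(x + e)[i] - forward(x - e)[i]) / (2 * eps)

bound = (1.0 * beta) ** (r + 1) * np.linalg.matrix_power(A_hat, r + 1)[i, s]
print(num_grad, bound)                      # agree: bound is tight for this linear model
```

For this linear model the derivative equals $(\alpha\beta)^{r+1} (\hat{A}^{r+1})_{is}$ exactly, which suggests the lemma is the "worst case" of the chain-rule expansion above when every gradient factor saturates its bound.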