I was reading this pdf, and Proposition 8 on page 6 states a result whose proof I don't fully follow.
I don't really understand the steps that lead from
$$\alpha = \sum_{j=1}^n\sum_{i=1}^n a_{ij} x_{i} x_{j}$$
to its partial derivative with respect to a single component $x_k$
$$\frac{\partial \alpha}{\partial x_k} = \sum_{j=1}^n a_{kj} x_{j} + \sum_{i=1}^n a_{ik}x_i$$
and then back to the final result:
$$\frac{\partial \alpha}{\partial {\bf x}} = {\bf x}^\top {\bf A}^\top + {\bf x}^\top {\bf A}$$
Can someone please help me?

Consider a generic $1 \leq k \leq n$. We can write the following:
$$\begin{aligned}
\alpha &= \sum_{j=1}^n\sum_{i=1}^n a_{ij} x_{i} x_{j} = \sum_{j=1}^n\left(\sum_{i=1, i \neq k}^n a_{ij} x_{i} x_{j} + a_{kj}x_{k}x_{j}\right) \\
&= \sum_{i=1, i \neq k}^n \sum_{j=1}^n a_{ij} x_{i} x_{j} + \sum_{j=1}^n a_{kj}x_{k}x_{j} \\
&= \sum_{i=1, i \neq k}^n \left(\sum_{j=1, j\neq k}^n a_{ij} x_{i} x_{j} + a_{ik}x_i x_k\right) + \sum_{j=1, j \neq k}^n a_{kj}x_{k}x_{j} + a_{kk}x_{k}^2 \\
&= \sum_{i=1, i \neq k}^n \sum_{j=1, j\neq k}^n a_{ij} x_{i} x_{j} + \sum_{i=1, i\neq k}^n a_{ik}x_i x_k + \sum_{j=1, j \neq k}^n a_{kj}x_{k}x_{j} + a_{kk}x_{k}^2.
\end{aligned}$$
Specifically, we have separated the contributions that depend on $x_k$ from those that do not. It is now clear that: $$\frac{\partial \alpha}{\partial x_k} = \sum_{i=1, i\neq k}^na_{ik}x_i + \sum_{j=1, j \neq k}^na_{kj}x_{j} + 2a_{kk}x_{k}.$$
We can further work on the last expression:
$$\frac{\partial \alpha}{\partial x_k} = \left[\sum_{i=1}^na_{ik}x_i - a_{kk}x_k\right] + \left[\sum_{j=1}^na_{kj}x_{j} - a_{kk}x_k\right] + 2a_{kk}x_{k} = \sum_{i=1}^na_{ik}x_i + \sum_{j=1}^na_{kj}x_{j}.$$
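This componentwise formula is easy to verify numerically (this check is not part of the original derivation, just a sanity test using a central finite difference, which is exact for quadratics up to floating-point error):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))   # generic, non-symmetric matrix
x = rng.normal(size=n)

alpha = lambda v: v @ A @ v   # alpha = sum_ij a_ij x_i x_j

k, h = 2, 1e-6
e_k = np.zeros(n)
e_k[k] = 1.0

# central finite difference for d(alpha)/dx_k
numeric = (alpha(x + h * e_k) - alpha(x - h * e_k)) / (2 * h)

# closed form: sum_i a_ik x_i + sum_j a_kj x_j
analytic = A[:, k] @ x + A[k, :] @ x

assert np.isclose(numeric, analytic, atol=1e-5)
```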
Now, we can obtain a vectorial representation. Let's pose:
$${\bf f} := {\bf x}^\top {\bf A}, \qquad {\bf g} := {\bf x}^\top {\bf A}^\top,$$
where ${\bf f}$ and ${\bf g}$ are row vectors.
It is clear that their components are:
$$f_k = \sum_{i=1}^n a_{ik}x_i, \qquad g_k = \sum_{j=1}^n a_{kj}x_{j},$$
so that $\frac{\partial \alpha}{\partial x_k} = f_k + g_k$ for every $k$, and hence:
$$\frac{\partial \alpha}{\partial {\bf x}} = {\bf x}^\top {\bf A} + {\bf x}^\top {\bf A}^\top.$$
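The full vector identity can be checked the same way, comparing the closed form against a finite-difference gradient (again just an illustrative sketch, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n))
x = rng.normal(size=n)

alpha = lambda v: v @ A @ v

# finite-difference gradient, one component at a time
h = 1e-6
grad_fd = np.array([
    (alpha(x + h * e) - alpha(x - h * e)) / (2 * h)
    for e in np.eye(n)
])

# closed form: x^T A + x^T A^T (a row vector, stored here as a 1-D array)
grad = x @ A + x @ A.T

assert np.allclose(grad_fd, grad, atol=1e-5)
```

Note that for symmetric ${\bf A}$ the two terms coincide and the formula reduces to the familiar $2{\bf x}^\top {\bf A}$.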