I am going through PCA notes and I am stumped by the last line in the term expansion (see image below). How did we get $-2\sum_{j=1}^N\sum_{i=1}^k\alpha_{ji}x_j^Tv_i$ from the two middle terms in the previous line?
EDIT: In the note, we are trying to minimize the projection error of the j-th vector $x_j$ onto the subspace $W$. $W$ has the basis vectors $v_1, v_2, ..., v_k$ which are all orthonormal. The goal is to figure out the optimal set of $\alpha$ and $v$ (as it is with PCA).
I understand that this might be something incredibly trivial, but I can't figure it out. TIA.

With the help of Jean-Claude's comment, I was able to figure it out. This is how the expression should be from the third line in the term expansion:
$$\sum_{j=1}^{N} \{ (x_j^T - \sum_{i=1}^{k}\alpha_{ji}v_i^T)(x_j - \sum_{i=1}^{k}\alpha_{ji}v_i) \}$$
$$=\sum_{j=1}^{N} \{ x_j^Tx_j - x_j^T\sum_{i=1}^{k}\alpha_{ji}v_i - (\sum_{i=1}^{k}\alpha_{ji}v_i^T)x_j + \sum_{i=1}^{k}\sum_{l=1}^{k}\alpha_{ji}\alpha_{jl}v_i^Tv_l \}$$
$\because$ $v_i^Tx_j$ is a scalar, it equals its own transpose, i.e. $v_i^Tx_j = (v_i^Tx_j)^T = x_j^Tv_i$. Also, since the $v_i$ are orthonormal, $v_i^Tv_l = 1$ if $i = l$ and $0$ otherwise, which collapses the last (quadratic) term to $\sum_{i=1}^{k}\alpha_{ji}^2$. So, we can write the above term as:
$$=\sum_{j=1}^{N} \{ || x_j ||^2 - x_j^T\sum_{i=1}^{k}\alpha_{ji}v_i - \sum_{i=1}^{k}\alpha_{ji}x_j^Tv_i + \sum_{i=1}^{k}\alpha_{ji}^2 \}$$
$$=\sum_{j=1}^{N} \{ || x_j ||^2 - x_j^T\sum_{i=1}^{k}\alpha_{ji}v_i - x_j^T\sum_{i=1}^{k}\alpha_{ji}v_i + \sum_{i=1}^{k}\alpha_{ji}^2 \}$$
$$=\sum_{j=1}^{N} \{ || x_j ||^2 - 2x_j^T\sum_{i=1}^{k}\alpha_{ji}v_i + \sum_{i=1}^{k}\alpha_{ji}^2 \}$$
And the rest of the term expansion makes sense after that.
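As a quick numeric sanity check, the identity derived above (left-hand side of the first line equals the last line) can be verified with NumPy for random data. The identity holds for *any* coefficients $\alpha_{ji}$ as long as the $v_i$ are orthonormal, so the sketch below (variable names are my own, not from the notes) draws both the data and the coefficients at random and builds an orthonormal basis via a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, N = 5, 3, 4  # ambient dimension, subspace dimension, number of points

# Columns of V form an orthonormal basis v_1, ..., v_k of the subspace W.
V, _ = np.linalg.qr(rng.standard_normal((d, k)))
X = rng.standard_normal((N, d))   # rows are the x_j
A = rng.standard_normal((N, k))   # A[j, i] plays the role of alpha_{ji}

# Left side: sum_j || x_j - sum_i alpha_{ji} v_i ||^2
lhs = sum(np.sum((X[j] - V @ A[j]) ** 2) for j in range(N))

# Right side: sum_j ( ||x_j||^2 - 2 x_j^T sum_i alpha_{ji} v_i + sum_i alpha_{ji}^2 )
rhs = sum(X[j] @ X[j] - 2 * X[j] @ (V @ A[j]) + A[j] @ A[j] for j in range(N))

print(np.isclose(lhs, rhs))  # prints True
```

Note that the orthonormality of the columns of `V` is exactly what makes `A[j] @ A[j]` (i.e. $\sum_i \alpha_{ji}^2$) equal to $\|\sum_i \alpha_{ji} v_i\|^2$; with a non-orthonormal basis the check fails.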