Suppose we have $n$ centered points $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$ in $m$-dimensional space. Let $\mathbf{v}$ denote the unit vector onto which we project the $\mathbf{x}$'s. The length of the projection of $\mathbf{x}_i$ is $y_i = \mathbf{x}_i^⊤ \mathbf{v}$. Since the points are centered, the variance of the projections is the mean squared projection length over all points $\mathbf{x}_i$:
$$ \begin{align*} \mathrm{Var} &= \frac{1}{n} \sum_{i=1}^n y_i^2 = \frac{1}{n}\sum_{i=1}^n\left(\mathbf{x}_i^⊤ \mathbf{v}\right)^2\\ &=\frac{1}{n}\sum_{i=1}^n \mathbf{x}_i^⊤ \mathbf{v} \cdot\mathbf{x}_i^⊤ \mathbf{v} = \frac{1}{n}\sum_{i=1}^n \mathbf{v}^⊤ \mathbf{x}_i \cdot\mathbf{x}_i^⊤ \mathbf{v}\\ &= \mathbf{v}^⊤ \underbrace{\left(\frac{1}{n}\sum_{i=1}^n \mathbf{x}_i \mathbf{x}_i^⊤\right)}_{\text{Covariance matrix}}\mathbf{v} = \mathbf{v}^⊤ C \mathbf{v} \end{align*} $$
The step that puzzles me is how we get from $\frac{1}{n}\sum_{i=1}^n \mathbf{v}^⊤ \mathbf{x}_i \cdot\mathbf{x}_i^⊤ \mathbf{v}$ to the next line. Aren't $\mathbf{v}^⊤ \mathbf{x}_i$ and $\mathbf{x}_i^⊤ \mathbf{v}$ dot products? How can we pull $\mathbf{v}^⊤$ and $\mathbf{v}$ out of a product of two scalars and sum over $\mathbf{x}_i \mathbf{x}_i^⊤$? Is the product $\mathbf{x}_i \mathbf{x}_i^⊤$ inside the middle factor an outer (tensor) product?
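As a sanity check, the identity can be verified numerically: the mean of the squared projections $(\mathbf{x}_i^⊤ \mathbf{v})^2$ agrees with the quadratic form $\mathbf{v}^⊤ C \mathbf{v}$, where $C$ is built from the outer products $\mathbf{x}_i \mathbf{x}_i^⊤$. A minimal NumPy sketch (all variable names here are my own, not from any particular library):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 3
X = rng.normal(size=(n, m))
X -= X.mean(axis=0)            # center the points
v = rng.normal(size=m)
v /= np.linalg.norm(v)         # unit direction vector

# Variance as the mean of squared projection lengths (x_i^T v)^2
var_direct = np.mean((X @ v) ** 2)

# Same quantity via the covariance matrix C = (1/n) sum_i x_i x_i^T.
# X.T @ X accumulates exactly that sum of outer products.
C = (X.T @ X) / n
var_quadratic = v @ C @ v

print(np.allclose(var_direct, var_quadratic))  # True
```

Both expressions agree up to floating-point rounding, which reflects that the step in question is just associativity of matrix multiplication applied to the $1 \times m$, $m \times 1$ factors.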