Resolution of Sum in $\sum_{i} \|\sum_j w_{ij}(\mathbf{x}_i - \mathbf{x}_j) \|^2$

I am following the derivation of the optimal weights for the LLE (locally linear embedding) algorithm. In the derivation I encountered this identity, which I could not understand:

$$ \sum_{i} \Big\|\sum_j w_{ij}(\mathbf{x}_i - \mathbf{x}_j)\Big\|^2 = \sum_{i} \sum_{j,k} w_{ij} w_{ik}(\mathbf{x}_i - \mathbf{x}_j)^T (\mathbf{x}_i - \mathbf{x}_k) $$

Some context:

  • Vectors $\mathbf{x}_i \in \mathbb{R}^m$ are the input vectors.
  • $w_{ij}$ are scalar elements of the weight matrix $W \in \mathbb{R}^{n \times n}$, where $n$ is the number of data points.

I am looking for the steps needed to get from the left-hand side to the right-hand side; more precisely, how to deal with the sum inside the $2$-norm.


Best answer:

This is nothing more than the definition of the Euclidean norm together with bilinearity of the inner product on $\mathbb{R}^m$. Write $\|v\|^2 = v^T v$, and rename the dummy summation index to $k$ in the second factor so the two sums can be multiplied out:
$$ \left\| \sum_j w_{ij}(x_i-x_j)\right\|^2 = \left( \sum_j w_{ij}(x_i-x_j)\right)^T\left( \sum_k w_{ik}(x_i-x_k)\right) = \sum_j \sum_k w_{ij}w_{ik}(x_i-x_j)^T(x_i-x_k). $$
Summing over $i$ then gives the stated identity.
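For anyone who wants a concrete sanity check, here is a small numerical sketch of the identity with NumPy. The dimensions $m$, $n$ and the random matrices are arbitrary choices for illustration, not part of the original derivation:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                      # ambient dimension, number of points (arbitrary)
X = rng.standard_normal((n, m))  # row i is the vector x_i
W = rng.standard_normal((n, n))  # weight matrix with entries w_ij

# Left-hand side: sum_i || sum_j w_ij (x_i - x_j) ||^2
lhs = sum(
    np.linalg.norm(sum(W[i, j] * (X[i] - X[j]) for j in range(n))) ** 2
    for i in range(n)
)

# Right-hand side: sum_i sum_{j,k} w_ij w_ik (x_i - x_j)^T (x_i - x_k)
rhs = sum(
    W[i, j] * W[i, k] * (X[i] - X[j]) @ (X[i] - X[k])
    for i in range(n)
    for j in range(n)
    for k in range(n)
)

print(np.isclose(lhs, rhs))  # the two sides agree up to floating-point error
```

Both sides evaluate to the same number, which is exactly what expanding the squared norm via bilinearity predicts.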