Vector summation done in matrix form (linear algebra)


I want to prove the following:

$\sum_{i=1}^{n} (\vec{x_i} \cdot \vec{w})^2 = (xw)^T (xw)$

where $\vec{x_i}$ is a $1\times p$ vector and $\vec{w}$ is a $p\times 1$ vector. If we stack our $n$ data vectors $\vec{x_i}$, we get an $n\times p$ matrix that we denote by $x$.

A simple example (with $n = p = 2$) might be:

$\vec{x_1} = \langle 1, 2\rangle$, $\vec{x_2} = \langle 3, 4\rangle$

$x = \begin{bmatrix} 1& 2\\ 3& 4 \end{bmatrix}$

$\vec{w} = \begin{bmatrix} 2 \\ 2 \end{bmatrix} $

By substituting this example into both sides of the first equation, I was able to verify the equality. However, I am looking for a mathematical proof, or an explanation of the linear algebra rules used to transform the left-hand side of the equation into the right-hand side.
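The numerical check described above can be sketched in NumPy (a sketch, not part of the original question; the variable names mirror the notation in the post):

```python
import numpy as np

# Example data from the question: the rows of x are the data vectors x_i
x = np.array([[1, 2],
              [3, 4]])
w = np.array([2, 2])

# Left-hand side: sum over i of the squared dot products (x_i . w)^2
lhs = sum((xi @ w) ** 2 for xi in x)

# Right-hand side: (xw)^T (xw), i.e. the dot product of xw with itself
xw = x @ w
rhs = xw @ xw

print(lhs, rhs)  # both print 232: (1*2 + 2*2)^2 + (3*2 + 4*2)^2 = 36 + 196
```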

Best answer

Let $x_{ij}$ denote the $j$th entry of $\vec{x_i}$. This is also the entry in row $i$, column $j$ of the matrix you're calling $x$. Using $w_k$ to denote the $k$th entry of $w$, the left-hand side is \begin{align} \sum_i (x_i \cdot w)^2 &=\sum_i \left(\sum_k (x_i)_k w_k \right)^2 \\ &=\sum_i \left(\sum_k x_{ik} w_k \right)^2\\ &=\sum_i \left[ \left(\sum_k x_{ik} w_k\right) \left(\sum_j x_{ij} w_j\right) \right] \\ &=\sum_i ( (xw)_i) ( (xw)_i) \\ &=(xw) \cdot (xw)\\ &=(xw)^T (xw)\\ \end{align} where $(xw)_i$ denotes the $i$th entry of the matrix-vector product $xw$, and the last step comes from the fact that for any two column vectors $u$ and $v$, $u \cdot v = u^T v$. (Note that the square applies to the whole inner sum $\sum_k x_{ik} w_k$, which is exactly the definition of the matrix-vector product entry $(xw)_i$; expanding that square with a second index $j$ gives the third line.)
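The identity holds for any $n$ and $p$, which can be spot-checked on random data (a sketch, not from the original answer; the dimensions and seed below are arbitrary):

```python
import numpy as np

# Random instance of the general case: n data vectors of dimension p
rng = np.random.default_rng(0)
n, p = 5, 3
x = rng.standard_normal((n, p))   # rows are the x_i
w = rng.standard_normal(p)

# Left-hand side: sum_i (x_i . w)^2
lhs = np.sum((x @ w) ** 2)

# Right-hand side: (xw)^T (xw)
xw = x @ w
rhs = xw.T @ xw

# The two sides agree up to floating-point rounding
assert np.isclose(lhs, rhs)
```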