From page 219 of 'Machine Learning' by Murphy:
$$\frac{1}{2}(\mathbf{y}-\mathbf{Xw})^T(\mathbf{y}-\mathbf{Xw})=\frac{1}{2}\mathbf{w}^T(\mathbf{X}^T\mathbf{X})\mathbf{w}-\mathbf{w}^T(\mathbf{X}^T\mathbf{y}).$$
Is there an easy way to see that this equality holds, without having to write out generic elements of both sides and checking that they turn out to be the same?
I asked my lecturer, and it turns out it's a misprint: the RHS is missing a $\frac{1}{2}\mathbf{y}^T\mathbf{y}$ term. Now it makes sense!
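For anyone reading later, the corrected identity follows directly from expanding the quadratic and using the fact that $\mathbf{y}^T\mathbf{Xw}$ is a scalar, hence equal to its own transpose $\mathbf{w}^T\mathbf{X}^T\mathbf{y}$:

$$\begin{aligned}
\frac{1}{2}(\mathbf{y}-\mathbf{Xw})^T(\mathbf{y}-\mathbf{Xw})
&=\frac{1}{2}\left(\mathbf{y}^T\mathbf{y}-\mathbf{y}^T\mathbf{Xw}-\mathbf{w}^T\mathbf{X}^T\mathbf{y}+\mathbf{w}^T\mathbf{X}^T\mathbf{Xw}\right)\\
&=\frac{1}{2}\mathbf{w}^T(\mathbf{X}^T\mathbf{X})\mathbf{w}-\mathbf{w}^T(\mathbf{X}^T\mathbf{y})+\frac{1}{2}\mathbf{y}^T\mathbf{y}.
\end{aligned}$$

Since $\frac{1}{2}\mathbf{y}^T\mathbf{y}$ does not depend on $\mathbf{w}$, dropping it (as the misprinted equation effectively does) changes the value of the objective but not its minimizer, which is presumably why the typo went unnoticed.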