Minimization problem with norm and matrix


In the context of principal component analysis, I arrived at the minimization problem:

$$\min_{A, (v_{i})} \sum_{i=1}^n \left\lVert X_{i}-Av_{i}\right\rVert^2$$

for $X_1,\dots,X_n\in \mathbb{R}^p$, $A\in\mathbb{R}^{p\times q}$ and $v_i\in\mathbb{R}^q$.

Assuming the mean $\bar{X}=0$, it has the solution:

$$\hat{v}_{i}=\hat{A}^T X_{i}, \qquad \hat{A}=(w_1,\dots,w_q),$$

where $W=(w_1,\dots,w_p)\in \mathbb{R}^{p\times p}$ collects the eigenvectors from the eigendecomposition $S = W\Lambda W^T$ of the sample covariance matrix $S=\frac{1}{n}\sum_{i=1}^n X_i X_i^T$, with eigenvalues sorted in decreasing order.
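As a numerical sanity check of this claimed solution (not part of the original question), the following NumPy sketch compares the reconstruction error of $\hat{A}=(w_1,\dots,w_q)$, $\hat{v}_i=\hat{A}^TX_i$ against the best rank-$q$ approximation of the data matrix given by the truncated SVD (Eckart–Young); the two errors should coincide. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, q = 5, 200, 2

# Centered data X_1, ..., X_n stored as the columns of a p x n matrix.
X = rng.normal(size=(p, n))
X -= X.mean(axis=1, keepdims=True)

# Sample covariance and its eigendecomposition S = W diag(lam) W^T.
S = X @ X.T / n
lam, W = np.linalg.eigh(S)        # eigh returns ascending eigenvalues
order = np.argsort(lam)[::-1]     # reorder to decreasing eigenvalues
lam, W = lam[order], W[:, order]

A_hat = W[:, :q]                  # first q eigenvectors
V_hat = A_hat.T @ X               # v_i = A^T X_i, column by column

err_pca = np.sum((X - A_hat @ V_hat) ** 2)

# Best rank-q approximation of X via the truncated SVD.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_q = U[:, :q] @ np.diag(s[:q]) @ Vt[:q, :]
err_svd = np.sum((X - X_q) ** 2)

assert np.isclose(err_pca, err_svd)
```

The check works because $\hat{A}\hat{A}^T$ is the orthogonal projection onto the span of the top-$q$ eigenvectors of $XX^T$, which are exactly the top-$q$ left singular vectors of $X$.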


It's been a long time since I did calculus and optimization. I tried to compute the derivatives and got (with $f$ being the function we want to minimize):

$$\frac{\partial{f}}{\partial{A}}= -2\sum_{i=1}^n \left(X_{i}-Av_{i}\right) v_i^T$$

and

$$\frac{\partial{f}}{\partial{v_k}} = -2\,A^T\left(X_{k}-Av_{k}\right)$$

Setting these to zero, I really have no clue how to get to the solution.
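For what it's worth, the candidate gradients $\frac{\partial f}{\partial A} = -2\sum_i (X_i - Av_i)v_i^T$ and $\frac{\partial f}{\partial v_k} = -2A^T(X_k - Av_k)$ can be verified numerically with central finite differences; this sketch (illustrative, not from the original post) checks the gradient with respect to $A$ entrywise.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, q = 4, 6, 2
X = rng.normal(size=(p, n))
A = rng.normal(size=(p, q))
V = rng.normal(size=(q, n))   # columns are v_1, ..., v_n

def f(A, V):
    """Objective: sum_i || X_i - A v_i ||^2."""
    return np.sum((X - A @ V) ** 2)

# Candidate closed-form gradients.
grad_A = -2 * (X - A @ V) @ V.T        # df/dA
grad_V = -2 * A.T @ (X - A @ V)        # column k is df/dv_k

# Central finite-difference approximation of df/dA.
eps = 1e-6
num_A = np.zeros_like(A)
for i in range(p):
    for j in range(q):
        E = np.zeros_like(A)
        E[i, j] = eps
        num_A[i, j] = (f(A + E, V) - f(A - E, V)) / (2 * eps)

assert np.allclose(grad_A, num_A, atol=1e-4)
```

Agreement between `grad_A` and `num_A` confirms the sign and the transpose placement in the formulas above.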