Let $A$ be a $p\times q$ matrix, with rank $q$. Show that the vector $x$ that minimizes $\|Ax\|_2$ under the constraint $\|x\|_2 = 1$ is the right singular vector of $A$ corresponding to the smallest singular value.
Minimization using singular values
Asked by Bumbble Comm. There are 2 answers below.
Let $$A = U \Sigma V^*$$ be the SVD of $A$. Because $U$ is orthogonal, $$\|Ax\| = \|\Sigma V^* x\|.$$ Setting $y = V^* x$, and noting that $\|y\| = \|x\| = 1$ since $V$ is orthogonal, we get $$\|Ax\|^2 = \|\Sigma y\|^2 = \sum_i |\sigma_i y_i|^2 \geq \sum_i |\sigma_q y_i|^2 = \sigma_q^2,$$ using $\sigma_i \geq \sigma_q$ for all $i$ and $\sum_i y_i^2 = 1$. So $\|Ax\| \geq \sigma_q$ for every unit vector $x$.

If $x$ is a right singular vector of $A$, then it is one of the columns of $V$. Since $V$ is orthogonal, there is an $i$ such that $V^* x = e_i$, hence $\|Ax\| = \|\Sigma V^* x\| = \|\Sigma e_i\| = \sigma_i$. In particular, any right singular vector associated with $\sigma_q$ attains the lower bound $\sigma_q$, and the result follows. Note that it is inaccurate to speak of *the* minimizer of $\|Ax\|$, since singular vectors are not unique (we can always multiply $x$ by $-1$).

Finally, note that every minimizer of $\|Ax\|$ must be a right singular vector associated with $\sigma_q$. To see this, adapt the proof of the inequality above!
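As a numerical sanity check (not part of the original answer), the sketch below uses NumPy's SVD on a random full-column-rank matrix: the last right singular vector attains $\|Ax\| = \sigma_q$, and no random unit vector does better.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random p x q matrix with p >= q; with probability 1 it has rank q.
p, q = 8, 5
A = rng.standard_normal((p, q))

# SVD: A = U @ diag(s) @ Vt, with singular values s in decreasing order,
# so s[-1] is sigma_q, the smallest singular value.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Candidate minimizer: the right singular vector for sigma_q
# (the last row of Vt, i.e. the last column of V). It is a unit vector.
v_min = Vt[-1]
assert np.isclose(np.linalg.norm(v_min), 1.0)

# It attains the lower bound: ||A v_min|| == sigma_q ...
assert np.isclose(np.linalg.norm(A @ v_min), s[-1])

# ... and no random unit vector beats it.
for _ in range(1000):
    x = rng.standard_normal(q)
    x /= np.linalg.norm(x)
    assert np.linalg.norm(A @ x) >= s[-1] - 1e-12
```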
Hint: let $x \in \mathbb{R}^q$ and write it as a linear combination of right singular vectors of $A$. Notice that $Ax$ is a linear combination of the corresponding left singular vectors of $A$. But these are orthogonal, so you have the Pythagorean theorem, which makes it straightforward to solve the minimization problem. In particular you should find that if $x=\sum c_i v_i$ then $Ax=\sum c_i \sigma_i u_i$ so $\| Ax \|_2^2 = \sum c_i^2 \sigma_i^2$.
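The identity in the hint, $\|Ax\|_2^2 = \sum_i c_i^2 \sigma_i^2$ with $c = V^* x$, can be checked numerically as well (a small sketch using NumPy, not part of the original hint):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((7, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A random unit vector x in R^q.
x = rng.standard_normal(4)
x /= np.linalg.norm(x)

# Coefficients of x in the basis of right singular vectors: c_i = v_i^T x.
c = Vt @ x

# Pythagorean identity from the hint: ||Ax||^2 = sum_i c_i^2 sigma_i^2.
lhs = np.linalg.norm(A @ x) ** 2
rhs = np.sum(c**2 * s**2)
assert np.isclose(lhs, rhs)
```

Since $\sum_i c_i^2 = 1$, the right-hand side is a convex combination of the $\sigma_i^2$, which is minimized by putting all the weight on the smallest one.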