Rows of V in reduced SVD with norm 1


Suppose we're given the reduced/compact SVD of the rank-$r$ matrix

$A=USV^T$

where $U\in\mathbb{R}^{m\times r}$, $S\in\mathbb{R}^{r\times r}$ and $V\in\mathbb{R}^{n\times r}$, and suppose the $i$-th column $A_i$ of $A$ is not in the span of the other columns $A_1,\dots,A_{i-1},A_{i+1},\dots,A_n$. Show that the $i$-th row of $V$ then has norm $1$, i.e. $\|(V^T)_i\|=1$.

I came across this problem myself while trying to implement something. I'm not sure whether the statement holds, but I couldn't find a counterexample so far. Possibly one could show something stronger, but this would suffice for my purposes. Does anyone have an idea? Thanks!
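For anyone who wants to experiment before reading a proof: here is a quick numerical check of the claimed statement with assumed toy data (a sketch, not a proof). Column $0$ of the constructed matrix is not in the span of the other columns, while column $3$ deliberately is.

```python
import numpy as np

# Build a rank-3, 5x4 matrix: columns of B are generic (independent),
# and the appended 4th column equals column 1 + column 2, so column 0
# is NOT in the span of the others, while column 3 IS.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))
A = np.column_stack([B, B[:, 1] + B[:, 2]])

# Compact SVD: keep only the r = rank(A) = 3 leading singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 3                              # rank of A by construction
V = Vt[:r].T                       # V is n x r; its rows are the v_i

row_norms = np.linalg.norm(V, axis=1)
print(row_norms)                   # entry 0 is 1, entries 1-3 are < 1
```

The row of $V$ belonging to the independent column comes out with norm $1$, and the rows belonging to the mutually dependent columns come out strictly shorter, consistent with the claim.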


I think I found a proof. For anyone who might be interested:

Consider the compact SVD of the rank-$r$ matrix $A=USV^T$ with $U\in \mathbb{R}^{m\times r}$, $S\in \mathbb{R}^{r\times r}$ and $V\in \mathbb{R}^{n\times r}$. For the $i$-th column of $A$, $A_i$, and the $i$-th row of $V$, $v_i$, it then holds that $A_i\notin \text{Span}(A_1,\dots,A_{i-1},A_{i+1},\dots,A_n)$ if and only if $\langle v_i,v_j\rangle =\delta_{ij}\enspace \forall j=1,\dots,n$.

($\Rightarrow$) Consider $k=[k_1,\dots,k_n]^T\in \ker(A)$, which satisfies \begin{align} 0=Ak=\begin{bmatrix} a_{11}k_1+\dots+a_{1n}k_n\\\vdots \\a_{m1}k_1+\dots +a_{mn}k_n \end{bmatrix}=k_1A_1+\dots+k_iA_i+\dots+k_nA_n. \end{align} Furthermore, the following implications hold: \begin{align*} &A_i\notin \text{Span}(A_1,\dots,A_{i-1},A_{i+1},\dots,A_n)\\ \Rightarrow\;& A_i\neq \sum_{j\neq i}\lambda_j A_j \enspace \forall \lambda_j\in \mathbb{R},\ j=1,\dots,i-1,i+1,\dots ,n\\ \Rightarrow\;&\left(0=\lambda A_i+\sum_{j\neq i}\lambda_jA_j,\enspace \lambda\in \mathbb{R}\enspace \Rightarrow \lambda=0\right). \end{align*}
Combining this with the kernel equation above: if $A_i$ is not in the span of the other columns of $A$, then $k_i=0$ for every kernel element $k$, and in particular for every vector of any basis of $\ker(A)$. Now complete $V$ to an orthonormal basis of $\mathbb{R}^n$ by appending $n-r$ orthonormal columns. The columns of the compact $V$ span the row space of $A$, so the appended columns span its orthogonal complement, $\ker(A)$; hence they all have a zero in the $i$-th position. The rows of the completed (orthogonal) matrix are orthonormal, so its $i$-th row has norm $1$; since the appended entries in that row are all zero, the $i$-th row of the compact $V$ must already have norm $1$. For the same reason, the inner product of the $i$-th compact row with any other compact row equals the inner product of the corresponding completed rows, which is $\delta_{ij}$.
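The key step of this direction can be illustrated numerically (assumed toy data, same construction as a $5\times 4$ rank-$3$ example with column $0$ independent of the rest): every kernel basis vector has a zero $i$-th entry.

```python
import numpy as np

# Column 0 is independent of the others; column 3 = column 1 + column 2,
# so ker(A) is one-dimensional and spanned by a multiple of [0, 1, 1, -1].
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 3))
A = np.column_stack([B, B[:, 1] + B[:, 2]])

# Kernel basis from the full SVD: right singular vectors whose singular
# value is (numerically) zero. A loose tolerance is fine here because the
# rank is 3 by construction.
_, s, Vt = np.linalg.svd(A)
tol = 1e-10 * s[0]
null_basis = Vt[s < tol]          # each row spans part of ker(A)

print(null_basis[:, 0])           # i = 0 entries: numerically zero
```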

($\Leftarrow$) Conversely, suppose $A_i\in \text{Span}(A_1,\dots,A_{i-1},A_{i+1},\dots,A_n)$. Then \begin{align*} \exists \lambda_1,\dots,\lambda_{i-1},\lambda_{i+1},\dots,\lambda_n\in \mathbb{R}\enspace :\enspace 0=-A_i+\sum_{j\neq i}\lambda_j A_j. \end{align*} Choosing $k=[\lambda_1,\dots,\lambda_{i-1},-1,\lambda_{i+1},\dots,\lambda_n]^T$ yields $k\in \ker(A)$ with non-zero $i$-th component. Hence every basis of $\ker(A)$ must contain a vector with non-zero $i$-th component (otherwise $k$, being a linear combination of the basis vectors, would have $k_i=0$). Completing $V$ to an orthonormal basis of $\mathbb{R}^n$ therefore appends a non-zero entry to the $i$-th row, and since the rows of the completed matrix have norm $1$, the $i$-th row $v_i$ of the compact $V$ cannot have norm $1$.
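The converse can also be checked numerically with assumed toy data: when column $3$ lies in the span of columns $1$ and $2$, the vector $k=[0,1,1,-1]^T$ is in the kernel with non-zero $3$rd entry, and row $3$ of the compact $V$ has norm strictly less than $1$.

```python
import numpy as np

# Rank-3, 5x4 example: column 3 = column 1 + column 2.
rng = np.random.default_rng(2)
B = rng.standard_normal((5, 3))
A = np.column_stack([B, B[:, 1] + B[:, 2]])

# The explicit kernel vector from the proof: coefficients lambda_j = 1
# for columns 1 and 2, and -1 in the dependent position.
k = np.array([0.0, 1.0, 1.0, -1.0])
print(np.linalg.norm(A @ k))                  # ~ 0: k is in ker(A)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 3                                          # rank of A by construction
V = Vt[:r].T                                   # n x r compact V
print(np.linalg.norm(V[3]))                    # < 1: row of a dependent column
```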