Given a matrix $A$ with right singular vectors $v_1, v_2, \dots, v_r$, and an arbitrary vector $v$ that can be written as a linear combination of $v_1, v_2, \dots, v_r$:
I am just wondering: is it true that
$$Av = \sum_{i=1}^{r} \sigma_i u_i v_i^T v,$$
given that we know
$$Av_j = \sum_{i=1}^{r} \sigma_i u_i v_i^T v_j,$$
where $\sigma_i$ are the singular values and $u_i$ are the left singular vectors?
This is Theorem 1.5 on page 8 here: https://www.cs.cmu.edu/~venkatg/teaching/CStheory-infoage/book-chapter-4.pdf
If it is correct, how do we go from the second equality to the first? Could someone kindly explain?
Thank you.
You are looking at the compact (reduced) version of the SVD, which discards the parts of the matrices $U$ and $V$ that contribute nothing to the product; in general $U$ and $V$ are square matrices. Let's restate the SVD:
If $A$ is a matrix in $\mathbb{R}^{n\times m}$ (that is, a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$), then there exist orthogonal matrices $U \in \mathbb{R}^{n\times n}$ and $V \in \mathbb{R}^{m\times m}$ and a matrix $\Sigma \in \mathbb{R}^{n\times m}$ that is zero everywhere except on its diagonal, whose entries are $\sigma_1, \cdots, \sigma_r$ with $r = \operatorname{rank}(A)$, such that $A = U\Sigma V^T$.
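A quick NumPy sanity check of this factorization (the matrix below is a made-up example; `np.linalg.svd` returns $V^T$ directly, and the full $\Sigma$ has to be rebuilt from the vector of singular values):

```python
import numpy as np

# A small example matrix (hypothetical, chosen only for illustration)
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [3., 0., 2.],
              [1., 1., 1.]])

# Full SVD: U is n x n, Vt is m x m, s holds the singular values
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the n x m Sigma: zeros everywhere except the diagonal
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

print(U.shape, Vt.shape)               # (4, 4) (3, 3)
print(np.allclose(A, U @ Sigma @ Vt))  # True
```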
All we do to reach the compact form is realize that we only need the first $r$ columns of $U$ and $V$, since in the product $A = U\Sigma V^T$ the other columns contribute nothing.
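NumPy's thin SVD illustrates this (same made-up matrix as an example; note that `full_matrices=False` keeps the first $\min(n, m)$ columns, which coincides with the compact form here because $\min(n, m) = \operatorname{rank}(A)$ for this matrix):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [3., 0., 2.],
              [1., 1., 1.]])

# Thin SVD: only the first min(n, m) columns of U and V are kept,
# yet the product is unchanged -- the dropped columns did nothing
U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(U.shape, Vt.shape)                    # (4, 3) (3, 3)
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True
```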
Now think about the SVD in its general form: what does it mean for $v_i$ to be a right singular vector? $$A v_i = U\Sigma V^T v_i$$ Look at $V^T v_i$ and use the fact that $V$ is orthogonal (note that $v_i^T v_j = \langle v_i, v_j\rangle$, and since the columns of $V$ are orthonormal, $v_i^T v_j = 1$ if $i = j$ and $0$ if $i \neq j$). We get: $$ V^Tv_i = \begin{bmatrix} v_1^T \\ \vdots \\ v_m^T \end{bmatrix} v_i = \begin{bmatrix} v_1^Tv_i \\ \vdots \\ v_m^Tv_i \end{bmatrix} = e_i $$ So: $$A v_i = U\Sigma V^T v_i = U\Sigma e_i = U \sigma_i e_i = \sigma_i u_i$$ if $i\leq r$, and zero if $i>r$. Now take $v = c_1 v_1 + \cdots + c_r v_r + \cdots + c_m v_m$; by linearity: $$ Av = c_1Av_1 + \cdots + c_mAv_m = c_1\sigma_1 u_1 + \cdots + c_r\sigma_r u_r. $$ To finish, just realize that $v_i^T v = \langle v_i, v\rangle = \langle v_i, c_1 v_1 + \cdots + c_m v_m\rangle = c_i$, and therefore: $$ Av = \sum_{i=1}^r \sigma_i u_i v_i^Tv $$
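The whole argument can be verified numerically. The sketch below (using a made-up example matrix) builds $v = c_1 v_1 + \cdots + c_r v_r$ from random coefficients and checks that $Av = \sum_{i=1}^r \sigma_i u_i (v_i^T v)$:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [3., 0., 2.],
              [1., 1., 1.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = np.linalg.matrix_rank(A)

# Build v as a linear combination of the right singular vectors v_1, ..., v_r
rng = np.random.default_rng(0)
c = rng.standard_normal(r)
v = Vt[:r].T @ c   # v = c_1 v_1 + ... + c_r v_r

# Right-hand side: sum_i sigma_i u_i (v_i^T v); Vt[i] is the row v_i^T
rhs = sum(s[i] * U[:, i] * (Vt[i] @ v) for i in range(r))

print(np.allclose(A @ v, rhs))  # True
```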