I am reading about the Singular Value Decomposition (SVD) in the book *SVD CSTheory Infoage*.
On page 6, the chapter says:
> A matrix $A$ can be described fully by how it transforms the vectors $v_i$. Every vector $v$ can be written as a linear combination of $v_1, v_2, \ldots, v_r$ **and a vector perpendicular to all the $v_i$**. Now, $Av$ is the same linear combination of $Av_1, Av_2, \ldots, Av_r$ as $v$ is of $v_1, v_2, \ldots, v_r$. So $Av_1, Av_2, \ldots, Av_r$ form a fundamental set of vectors associated with $A$.
Here the $v_j$ are the right singular vectors.

I could not understand why the extra vector perpendicular to all the $v_i$ is required.
Note: my question concerns the bold part of the quoted text.
Let $E=\operatorname{span}(v_1,\dots,v_r)$ and let $F$ be its orthogonal complement, so $\mathbb{R}^d = E \oplus F$. If $v\in\mathbb{R}^d$, then $v=\sum_i a_i v_i + w$ where $w\in F$; the perpendicular vector is needed simply because the $v_i$ need not span all of $\mathbb{R}^d$ when $r < d$. Since $w$ is orthogonal to every right singular vector with a nonzero singular value, $w$ lies in the null space of $A$, so $Aw=0$. Hence $Av=\sum_i a_i Av_i$, and $\operatorname{im}(A)=\operatorname{span}(Av_1,\dots, Av_r)$ with $r=\operatorname{rank}(A)$.
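A quick numerical check of the argument above (a sketch using NumPy; the matrix $A$ and the test vector $v$ are made-up examples): split an arbitrary $v$ into its component in $\operatorname{span}(v_1,\dots,v_r)$ plus a perpendicular remainder $w$, then verify that $Aw=0$ and that $Av$ equals the same linear combination of the $Av_i$.

```python
import numpy as np

# A made-up rank-deficient matrix: rows 1 and 2 are dependent, so rank(A) = 2 < 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))   # numerical rank = number of nonzero singular values
V_r = Vt[:r].T               # columns are the right singular vectors v_1, ..., v_r

v = np.array([1.0, -2.0, 0.5])   # an arbitrary test vector

a = V_r.T @ v        # coefficients a_i = <v, v_i> (the v_i are orthonormal)
w = v - V_r @ a      # remainder, perpendicular to all the v_i

# w lies in the null space of A, so A annihilates it:
assert np.allclose(A @ w, 0)
# Av is the same linear combination of the Av_i as v is of the v_i:
assert np.allclose(A @ v, A @ (V_r @ a))
```

The asserts confirm both claims: the perpendicular part $w$ contributes nothing to $Av$, so $A$ is fully determined by what it does to $v_1,\dots,v_r$.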