I'm doing an experiment on image compression based on SVD, which relies on the statement that the singular vectors $u$ and $v$ (obtained when doing SVD on an integer matrix) take values only in the range $[-1, 1]$.
I've looked in Gilbert Strang's book 'Introduction to Linear Algebra' but haven't found a proof of the statement.
Does someone know a formal publication I can reference for that statement?
Does that statement follow from some other property of eigenvectors?
This is trivial, so finding a reference may be harder than proving it. Let $A = U \Sigma V^*$. Singular vectors are columns of unitary (orthogonal, in the real case) matrices. Let's focus on $V$ (the same goes for $U$) with columns $v_j$ and elements $v_{ij}$. We know that
$$V^*V = I,$$
but what are the elements of $V^*V$? These are scalar products of columns:
$$v_i^* v_j = (V^*V)_{ij} = \delta_{ij} \quad \text{(Kronecker $\delta$)}.$$
In particular, this tells us that $v_j^*v_j = 1$ for all $j$ (hence the name orthonormal, or unitary, matrix). Now we just need to expand $v_j^* v_j$:
$$1 = v_j^* v_j = \sum_{i=1}^n \overline{v}_{ij} v_{ij} = \sum_{i=1}^n |v_{ij}|^2.$$
Since $|v_{ij}|^2 \ge 0$ and the sum is $1$, we conclude that $|v_{ij}| \le 1$ for all $i,j$. In the real case, that means $v_{ij} \in [-1,1]$ for all $i,j$.