I'm trying to get an intuitive understanding of what an SVD decomposition does to an image.
From my understanding, for an image $A \in \Bbb R^{m \times n}$, the singular values are the square roots of the eigenvalues of $A^TA$ (or $AA^T$), the left singular vectors are the eigenvectors of $AA^T$, and the right singular vectors are the eigenvectors of $A^TA$.
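(A quick numerical sanity check of that relationship, sketched with NumPy; the matrix here is just a random stand-in for an image:)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # stand-in for an m-by-n image

# Singular values of A, returned in descending order
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of the symmetric matrix A^T A, sorted descending
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

# The singular values are the square roots of these eigenvalues
print(np.allclose(s, np.sqrt(eigvals)))  # True
```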
What does, for instance, just the first left and right singular vector mean in an image and why are we looking at eigenvectors of $AA^T$ or $A^TA$ - how does this relate to an image? Many thanks!
For example, if we look at this image:

Looking at the first left singular vector shows an approximation of the intensity values in the image rows:

Hopefully that makes what I'm asking a little clearer: why do the eigenvectors of $AA^T$ show this relationship?
Here's an idea that you might like: let $\sigma_j$ denote the singular values (diagonal entries of $\Sigma$), $u_j$ the left singular vectors (columns of $U$), and $v_j$ the right singular vectors (columns of $V$). We then have $$ A = U \Sigma V^T = \sum_{j=1}^{\min\{m,n\}} \sigma_j u_jv_j^T $$ That is, we've expressed $A$ as a linear combination (with nonnegative coefficients) of rank-1 matrices $u_jv_j^T$. These matrices are mutually orthogonal with respect to the Frobenius inner product.
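The expansion above is easy to verify numerically; here's a minimal NumPy sketch (again with a random matrix standing in for the image):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))

# Thin SVD: U is 4x4, s has min(m, n) = 4 entries, Vt is 4x6
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reassemble A as the sum of rank-1 matrices sigma_j * u_j v_j^T
A_rebuilt = sum(s[j] * np.outer(U[:, j], Vt[j, :]) for j in range(len(s)))

print(np.allclose(A, A_rebuilt))  # True
```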
Since the singular values are ranked from largest to smallest, the matrices $u_j v_j^T$ are ranked in decreasing order of the prominence of their contribution to $A$.
A common trick in image processing, then, is to only look at the first few. That is, we make the approximation $$ A \approx \sum_{j=1}^k \sigma_j u_j v_j^T $$ for some $k < \min\{m,n\}$.
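This truncation is easy to try out; a sketch of the rank-$k$ approximation, which also checks the (standard Eckart–Young) fact that its spectral-norm error is exactly $\sigma_{k+1}$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 8))  # stand-in for an image
U, s, Vt = np.linalg.svd(A)

# Keep only the first k singular triples
k = 3
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The spectral-norm error of the rank-k truncation equals sigma_{k+1}
err = np.linalg.norm(A - A_k, 2)
print(np.isclose(err, s[k]))  # True
```

For an actual image, one would load a grayscale array (e.g. with `matplotlib.image.imread`) in place of the random matrix and display `A_k` to see how few terms already capture most of the picture.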