I want to understand the relationship between the SVD of an arbitrary $n \times p$ matrix $X$ and the spectral decomposition of $X^TX$.
My understanding:
Using SVD we can write
$$ X = ULV^T$$
where $U$ and $V$ are orthogonal matrices and $L$ is a (rectangular) diagonal matrix holding the singular values.
And we can then write:
$$X^TX = V(L^TL)V^T $$
Then by the spectral decomposition, using that $X^TX$ is symmetric, we can write:
$$X^TX = E\Lambda E^T$$
where $E$ is the matrix with columns given by the eigenvectors of $X^TX$.
Does this then imply that the columns of $V$ are also eigenvectors of $X^TX$?
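As a quick numerical sanity check (a sketch using NumPy with randomly generated data, not part of the proof below), one can verify that the columns of $V$ returned by an SVD routine are indeed eigenvectors of $X^TX$, with eigenvalues the squared singular values:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 3
X = rng.standard_normal((n, p))

# SVD: X = U L V^T; numpy returns the singular values s and V^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T

# Each column of V should be an eigenvector of X^T X,
# with eigenvalue equal to the squared singular value.
G = X.T @ X
for i in range(p):
    assert np.allclose(G @ V[:, i], s[i] ** 2 * V[:, i])
```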
Yes: from $X^TX=V(L^TL)V^T$ with $L^TL$ diagonal, the columns of $V$ are eigenvectors of $X^TX$, with eigenvalues the squared singular values. Typically, though, one does this in the reverse order, ending up at the SVD $X=ULV^T$ by starting from the spectral decompositions of $X^TX$ and $XX^T=(X^T)^TX^T$.
Note that if $Q$ is an $r\times r$ diagonal matrix with diagonal entries $q_1,\ldots, q_r$ and $P$ is an orthogonal matrix with columns $p_i$, $$\text{col}(PQP^T)=\text{span}\{p_i:q_i\neq 0\}=\text{span}\{p_i:q_i=0\}^\perp=\text{null}(PQP^T)^\perp,$$ where $\text{col}(M)$ and $\text{null}(M)$ denote the column and null spaces of the matrix $M$, respectively. Moreover, the dimension of the column space is the number of non-zero eigenvalues repeated according to multiplicity.
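This note can be illustrated numerically (a sketch, with a random orthogonal $P$ built via QR and an arbitrary choice of zero/non-zero diagonal entries):

```python
import numpy as np

rng = np.random.default_rng(1)
r = 4
q = np.array([3.0, 1.5, 0.0, 0.0])                # two non-zero, two zero diagonal entries
P, _ = np.linalg.qr(rng.standard_normal((r, r)))  # random orthogonal matrix
M = P @ np.diag(q) @ P.T

# dim col(M) equals the number of non-zero q_i ...
assert np.linalg.matrix_rank(M) == np.count_nonzero(q)
# ... and the columns p_i with q_i = 0 lie in the null space of M
assert np.allclose(M @ P[:, 2:], 0)
```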
To start, let $X$ be $m\times n$. To avoid a trivial case, assume $X\neq 0$. Apply the spectral theorem to $X^TX$ to get $X^TX=VDV^T$, where $D$ is a diagonal matrix with non-negative entries and $V$ is an orthogonal matrix. Let $d_1,\ldots, d_n$ be the diagonal entries of $D$ and assume without loss of generality that $d_1\geqslant \ldots \geqslant d_n$.
Let $p$ be the maximum $i\in \{1,\ldots, n\}$ such that $d_i\neq 0$; such an $i$ exists since $X\neq 0$. Note that $p=\text{rank}(X^TX)\leqslant \min\{m,n\}$. Since we've enumerated the eigenvalues in non-increasing order, we have $d_1\geqslant \ldots \geqslant d_p>0=d_{p+1}=\ldots = d_n$.
For each $1\leqslant i\leqslant p$, let $w_i=Xv_i$. Then $$X^Tw_i = X^TXv_i=d_iv_i\neq 0,$$ which implies $w_i\neq 0$. Note also that $$XX^T w_i = X(X^TXv_i) = X(d_iv_i)=d_iXv_i=d_i w_i.$$ So $w_1,\ldots, w_p$ are eigenvectors of $XX^T$ for the eigenvalues $d_1,\ldots, d_p$, respectively. This shows that every non-zero eigenvalue of $X^TX$ is an eigenvalue of $XX^T$, with multiplicity as an eigenvalue of $XX^T$ at least as large as its multiplicity as an eigenvalue of $X^TX$. Swapping the roles of $X$ and $X^T$ gives the reverse inequality, so $X^TX$ and $XX^T$ have the same non-zero eigenvalues with the same multiplicities.
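A quick numerical check of this fact (a sketch with a random full-column-rank $X$, so $XX^T$ just picks up $m-n$ extra zero eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 4
X = rng.standard_normal((m, n))

eig_small = np.linalg.eigvalsh(X.T @ X)  # n eigenvalues, ascending
eig_big = np.linalg.eigvalsh(X @ X.T)    # m eigenvalues, ascending

# XX^T has the same non-zero eigenvalues as X^TX,
# plus m - n extra zeros (X has full column rank here).
assert np.allclose(eig_big[:m - n], 0, atol=1e-10)
assert np.allclose(eig_big[m - n:], eig_small)
```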
For $1\leqslant i,j\leqslant p$ with $i\neq j$, $$w_i^Tw_j = (Xv_i)^T (Xv_j)=v_i^T X^TX v_j = v_i^T (d_jv_j)=d_j v_i^Tv_j=0,$$ since $v_i,v_j$ are orthogonal. This implies that $w_1,\ldots, w_p$ are orthogonal. Note that $$\|w_i\|^2=\|Xv_i\|^2 = (Xv_i)^T(Xv_i)=v_i^T X^TXv_i=d_i v_i^Tv_i=d_i.$$ Let $u_i=w_i/\sqrt{d_i}$ for $i=1,\ldots, p$, so $\|u_i\|=1$. Note also that $Xv_i=w_i=\sqrt{d_i}u_i$. Let $u_{p+1},\ldots, u_m$ extend $u_1,\ldots,u_p$ to an orthonormal basis for $\mathbb{R}^m$, and let $U$ be the $m\times m$ orthogonal matrix with columns $u_1,\ldots, u_m$. Let $E$ be the $m\times m$ diagonal matrix with entries $d_1,\ldots, d_p, 0,\ldots, 0$. As we noted above, $$\mathbb{R}^m=\text{col}(XX^T)\oplus \text{null}(XX^T),$$ and the dimension of the column space is the number of non-zero eigenvalues repeated according to multiplicity, which is $p$ in this case. Therefore $u_1,\ldots, u_p$ form a basis for the column space, since they lie in the column space and are linearly independent, and $u_{p+1},\ldots, u_m$ form a basis for $$\text{span}\{u_1,\ldots, u_p\}^\perp=\text{col}(XX^T)^\perp = \text{null}(XX^T).$$ From this we can easily check that $UEU^T u_i=d_iu_i=XX^Tu_i$ if $i\leqslant p$ and $UEU^Tu_i=0=XX^Tu_i$ for $p<i\leqslant m$. So $XX^T=UEU^T$.
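The construction above can be mirrored numerically (a sketch; the orthonormal-basis extension is done with a QR factorization, and the threshold `1e-10` for "non-zero eigenvalue" is an arbitrary numerical choice):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 3
X = rng.standard_normal((m, n))

# Spectral decomposition of X^T X, eigenvalues sorted non-increasingly
d, V = np.linalg.eigh(X.T @ X)
order = np.argsort(d)[::-1]
d, V = d[order], V[:, order]
p = int(np.sum(d > 1e-10))           # number of non-zero eigenvalues

# u_i = X v_i / sqrt(d_i) for i <= p, then extend to an orthonormal
# basis of R^m; QR may flip signs, so keep our u_i exactly
U_p = X @ V[:, :p] / np.sqrt(d[:p])
Q, _ = np.linalg.qr(np.hstack([U_p, rng.standard_normal((m, m - p))]))
U = np.hstack([U_p, Q[:, p:]])

# E = diag(d_1, ..., d_p, 0, ..., 0) gives XX^T = U E U^T
E = np.zeros((m, m))
E[:p, :p] = np.diag(d[:p])
assert np.allclose(U @ E @ U.T, X @ X.T)
```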
This gives us lots of information about the eigenvalues/eigenvectors of $X^TX$ and $XX^T$, and how they relate to each other.
If $L$ is the $m\times n$ matrix whose $(i,i)$ entry is $\sqrt{d_i}$ for $1\leqslant i\leqslant p$, and all of whose other entries are zero, then $$X=ULV^T.$$ This is because for $p<i\leqslant n$, $X^TXv_i=0$, so $$0=v^T_i X^TXv_i = (Xv_i)^TXv_i=\|Xv_i\|^2,$$ and hence $Xv_i=0$. Therefore, writing an arbitrary $v\in\mathbb{R}^n$ as $v=\sum_{i=1}^n a_iv_i$, $$Xv=\sum_{i=1}^n a_iXv_i=\sum_{i=1}^p a_i\sqrt{d_i}u_i=ULV^Tv.$$ So $X=ULV^T$.
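Putting it all together, the whole derivation can be checked end to end (a sketch, same conventions as above: $U$ built from the spectral decomposition of $X^TX$, $L$ the $m\times n$ rectangular diagonal matrix of $\sqrt{d_i}$):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 5, 3
X = rng.standard_normal((m, n))

# Spectral decomposition of X^T X with eigenvalues in non-increasing order
d, V = np.linalg.eigh(X.T @ X)
order = np.argsort(d)[::-1]
d, V = d[order], V[:, order]
p = int(np.sum(d > 1e-10))

# u_i = X v_i / sqrt(d_i), extended to an orthonormal basis via QR
# (keep the first p columns exactly, since QR may flip their signs)
U_p = X @ V[:, :p] / np.sqrt(d[:p])
Q, _ = np.linalg.qr(np.hstack([U_p, rng.standard_normal((m, m - p))]))
U = np.hstack([U_p, Q[:, p:]])

# L is m x n with sqrt(d_i) in the (i, i) entries, zeros elsewhere
L = np.zeros((m, n))
L[:p, :p] = np.diag(np.sqrt(d[:p]))

# The constructed factors recover X, i.e. X = U L V^T
assert np.allclose(U @ L @ V.T, X)
```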