Eigenvalues and eigenvectors can be thought of in two complementary ways: a defining equation, and a matrix factorization.
A lot of intuition about eigenvalues and eigenvectors comes from the defining equation $Ax=\lambda x$: when $x$ is an eigenvector of $A$, applying $A$ simply scales $x$ by the factor $\lambda$.
Eigenvalues and eigenvectors can also be thought of (when $A$ is symmetric) as a matrix factorization, $A = Q \Lambda Q^T$, where the columns of $Q$ are orthonormal eigenvectors and $\Lambda$ is a diagonal matrix of the corresponding eigenvalues.
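Both views are easy to check numerically. Here is a minimal numpy sketch (the matrix values are just illustrative, not taken from anything above) that verifies the defining equation $Ax = \lambda x$ and the factorization $A = Q \Lambda Q^T$ for a small symmetric matrix:

```python
import numpy as np

# A small symmetric matrix (illustrative values only).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# For symmetric/Hermitian matrices, np.linalg.eigh returns real eigenvalues
# and a matrix Q whose columns are orthonormal eigenvectors.
eigvals, Q = np.linalg.eigh(A)

# Defining-equation view: A x = lambda x for each eigenvector column.
for lam, x in zip(eigvals, Q.T):
    assert np.allclose(A @ x, lam * x)

# Factorization view: A = Q Lambda Q^T.
Lambda = np.diag(eigvals)
assert np.allclose(A, Q @ Lambda @ Q.T)
```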
The SVD, by contrast, is invariably presented as a matrix factorization, $A = U \Sigma V^*$. So is there a defining equation for the SVD corresponding to the equation $Ax=\lambda x$ enjoyed by eigenvalues and eigenvectors?
Strang would say that the "SVD equation" is $$ Av = \sigma u, $$ where $\sigma$ is a singular value and $u, v$ are the corresponding columns of $U$ and $V$ respectively. That is, $v$ is a unit vector that, when acted on by $A$, produces a vector of length $\sigma$ in the direction of the unit vector $u$. This follows directly from the factorization: multiplying $A = U \Sigma V^*$ on the right by $V$ gives $AV = U\Sigma$, whose $i$-th column reads $Av_i = \sigma_i u_i$.
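A quick numpy sketch of that column-by-column relationship (again with an arbitrary illustrative matrix): for each singular triple $(\sigma_i, u_i, v_i)$ returned by a library SVD, $Av_i = \sigma_i u_i$ holds.

```python
import numpy as np

# Any matrix works here, square or not (values are illustrative).
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0]])

# np.linalg.svd returns U, the singular values, and V^T (here Vt).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The "SVD equation": A v_i = sigma_i u_i for each singular triple.
for sigma, u, v in zip(s, U.T, Vt):
    assert np.allclose(A @ v, sigma * u)
```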
Note, however, that it's not enough to simply find nonzero solutions $u, v$ of this equation. The difficult constraint of the SVD is that the vectors $v_1, v_2, \dots$ and $u_1, u_2, \dots$ must form orthonormal bases of the domain and codomain respectively. So unfortunately, "solving" this equation in the traditional sense doesn't lead to a useful SVD algorithm.
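The orthonormality constraint is exactly what a library SVD delivers and what ad hoc solutions of $Av = \sigma u$ would generally miss. A small check of that property (with a random matrix, purely for illustration):

```python
import numpy as np

# Random matrix purely for illustration.
A = np.random.default_rng(0).normal(size=(4, 3))
U, s, Vt = np.linalg.svd(A)

# The columns of U and of V are orthonormal: U^T U = I and V^T V = I.
assert np.allclose(U.T @ U, np.eye(U.shape[1]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))
```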