If we have any real $n \times m$ matrix $M$, the SVD (singular value decomposition) lets us decompose it as $M = U\Sigma V^T$, where $V$ is a real orthogonal matrix whose columns are eigenvectors of $M^TM$.
From Wikipedia, I know that $M^TM = (U\Sigma V^T)^T(U\Sigma V^T) = V\Sigma^T U^T U\Sigma V^T = V(\Sigma^T\Sigma)V^T$,
which, to my understanding, should prove that the eigenvectors of $M^TM$ are the columns of $V$ in the SVD. But I feel like I am missing something: it seems we never proved that $V$'s columns are eigenvectors, only that they fit the diagonalization format. Couldn't there be a different matrix, not made of eigenvectors, that when multiplied by a diagonal matrix of non-eigenvalues and then by its own transpose also gives $M^TM$?
If so, we could use it in the SVD and everything would still fit. Which brings me to my question: how do we know that $V$ in the SVD consists of eigenvectors of $M^TM$?

No, this is not possible. Here is a simple, more general statement that should resolve your doubt.
Let $A$ be an arbitrary square matrix, $V$ an invertible matrix, and $D$ a diagonal matrix such that $A=VDV^{-1}$. Then the columns of $V$ are eigenvectors of $A$, and the diagonal entries of $D$ are the corresponding eigenvalues.
Proof: just rewrite the equation slightly to get $AV=VD$, and then read off the $i$-th column of each side to get $Av_i=D_{ii}v_i$, where $v_i$ is the $i$-th column of $V$. $\square$
Note: this shows that every column of such a $V$ must be an eigenvector of $A$; it does not say $V$ is unique (columns can be reordered or rescaled, and for a repeated eigenvalue any basis of its eigenspace works). In your case $A = M^TM = V(\Sigma^T\Sigma)V^T$ with $V$ orthogonal, so the columns of $V$ are orthonormal eigenvectors of $M^TM$ with eigenvalues $\sigma_i^2$.
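As a quick numerical sanity check (not a substitute for the proof above), here is a NumPy sketch that takes a random real matrix $M$, computes its SVD, and verifies that each column $v_i$ of $V$ satisfies $M^TM\,v_i = \sigma_i^2 v_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))  # an arbitrary real n x m matrix

# Thin SVD: M = U @ diag(s) @ Vt, with s the singular values
U, s, Vt = np.linalg.svd(M, full_matrices=False)
V = Vt.T  # columns of V are the right singular vectors

A = M.T @ M
for i in range(V.shape[1]):
    v = V[:, i]
    # each column of V is an eigenvector of M^T M with eigenvalue s_i^2
    assert np.allclose(A @ v, s[i] ** 2 * v)
```

Running this for any real $M$ will pass the assertions, consistent with the proposition: since $M^TM = V(\Sigma^T\Sigma)V^{-1}$ with $\Sigma^T\Sigma$ diagonal, the columns of $V$ have no choice but to be eigenvectors.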