I have a question about this algorithm to find the spectral norm via power iteration from the paper Spectral Normalization for Generative Adversarial Networks.
I'm not sure I've understood it correctly: $v$ approximates the dominant eigenvector of $W^\top$, and $u$ approximates the dominant eigenvector of $W W^\top$. The largest singular value of $W$ is $u^\top W v$, but why is this the case?
More specifically, I thought that $v$ should be the dominant eigenvector of $W^\top W$ rather than of $W^\top$, according to the SVD of $W$ (in other words, $v$ is the first right singular vector in the decomposition $W = USV^\top$).
Please let me know if I can revise the question for clarity. Thank you!

Indeed, as you suggest, this algorithm works: $\tilde{u}$ and $\tilde{v}$ converge to the dominant left and right singular vectors of $W$, which are also the dominant eigenvectors of $W W^\top$ and $W^\top W$ respectively (under the stated assumptions). $v$ does not approximate the dominant eigenvector of $W^\top$; indeed, the algorithm works even when $W$ is rectangular, in which case $W^\top$ has no eigenvectors at all!
Setting $\tilde{v}$ aside for a moment, if one applies the update rule (18) and tracks its effect on $\tilde{u}$ alone, the update takes the form $\tilde{u} \leftarrow WW^\top \tilde{u} / \| WW^\top \tilde{u} \|_2$. This is nothing but the classic power method applied to $WW^\top$, so $\tilde{u}$ converges to the dominant eigenvector $u$ of $WW^\top$ (equivalently, the dominant left singular vector of $W$). But then $W^\top u = \sigma(W) v$, where $v$ is the dominant right singular vector, so $\tilde{v} = W^\top \tilde{u} / \| W^\top \tilde{u} \|_2$ converges to $W^\top u / \| W^\top u \|_2 = \sigma(W) v / \| \sigma(W) v\|_2 = v$. Thus, we conclude that
$$ \tilde{u}^\top W \tilde{v} \mbox{ converges to } u^\top W v = (W^\top u)^\top v = \sigma(W) v^\top v = \sigma(W). $$
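For concreteness, here is a minimal NumPy sketch of this alternating power iteration on a random rectangular $W$ (variable names are my own; the paper's update is the alternating normalization of $W^\top \tilde{u}$ and $W \tilde{v}$, which composes to the $WW^\top$ power step described above). The estimate $\tilde{u}^\top W \tilde{v}$ is compared against the largest singular value from a full SVD:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))  # rectangular, so W^T has no eigenvectors

# Random initial u, normalized
u = rng.standard_normal(4)
u /= np.linalg.norm(u)

# Alternating power iteration:
#   v <- W^T u / ||W^T u||_2,  u <- W v / ||W v||_2
# One pass of both updates applies W W^T to u (up to normalization),
# i.e. the classic power method on W W^T.
for _ in range(100):
    v = W.T @ u
    v /= np.linalg.norm(v)
    u = W @ v
    u /= np.linalg.norm(u)

sigma_est = u @ W @ v                                  # u^T W v
sigma_true = np.linalg.svd(W, compute_uv=False)[0]     # largest singular value
print(sigma_est, sigma_true)
```

With a generic random matrix the singular value gap $\sigma_2/\sigma_1 < 1$ makes the iteration converge geometrically, so after 100 steps the two printed values agree to high precision.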