I have a rank-$1$ matrix $A \in \mathbb{R}^{m \times n}$ and a vector $u$ in its image. I was able to prove that the columns of $A$ are multiples of the vector $u$, and that $A$ can be written as $A = \alpha u v^t$, with $u$ and $v$ unit vectors and $\alpha > 0$. I would now like to show that there exist two orthogonal matrices, $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$, having $u$ and $v$ as their first columns, respectively. The goal is to prove that every rank-$1$ matrix has a singular value decomposition (SVD).
Would it be correct to take $U$ and $V$ to be identity matrices with $u$ and $v$ in place of the first column?
If I have such $U$ and $V$, is it possible to substitute them directly into $U \Sigma V^t$?
So far I know that the only nonzero entry of $\Sigma$ is $\alpha$.
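As a quick numerical sanity check (a NumPy sketch with made-up data; the dimensions and the value of $\alpha$ are arbitrary), one can build a rank-$1$ matrix $A = \alpha u v^t$ from unit vectors and inspect its singular values:

```python
import numpy as np

# Assumed example data: random unit vectors u in R^4, v in R^3, alpha = 3.
rng = np.random.default_rng(0)
alpha = 3.0
u = rng.standard_normal(4); u /= np.linalg.norm(u)   # unit vector in R^m
v = rng.standard_normal(3); v /= np.linalg.norm(v)   # unit vector in R^n
A = alpha * np.outer(u, v)                           # rank-1 matrix A = alpha u v^t

s = np.linalg.svd(A, compute_uv=False)
print(np.round(s, 6))  # one singular value equal to alpha, the rest ~0
```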
If $u$ is a unit vector, it is often taken for granted that we can construct an orthonormal basis whose first vector is $u$; using these basis vectors as the columns of a matrix produces an orthogonal matrix.
A non-concrete justification of this fact is that if $e_1,\dots,e_n$ denotes the standard basis, then $\{u,e_1,\dots,e_n\}$ is a spanning set of $\Bbb R^n$ (or $\Bbb C^n$). It follows that we can extract a linearly independent subset containing $u$ that still spans $\Bbb R^n$. Applying the Gram–Schmidt process to the resulting basis $(u,e_{i_1},\dots,e_{i_{n-1}})$ produces an orthonormal basis whose first element is $u$.
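The argument above can be sketched in code. This is a minimal NumPy implementation (the helper name `orthonormal_basis_with_first` is mine): it runs Gram–Schmidt on $(u, e_1, \dots, e_n)$ and discards vectors that become numerically zero, which implicitly selects the independent subset.

```python
import numpy as np

def orthonormal_basis_with_first(u, tol=1e-12):
    """Extend the unit vector u to an orthonormal basis of R^n by
    Gram-Schmidt on (u, e_1, ..., e_n), dropping dependent vectors."""
    n = u.shape[0]
    basis = [u / np.linalg.norm(u)]
    for e in np.eye(n):
        # Subtract the components of e along the vectors collected so far.
        w = e - sum(np.dot(e, q) * q for q in basis)
        if np.linalg.norm(w) > tol:        # e was independent of the rest
            basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)          # orthogonal matrix, first column u

u = np.array([3.0, 0.0, 4.0]) / 5.0        # example unit vector
Q = orthonormal_basis_with_first(u)
# Q is orthogonal and its first column is u.
```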
There are more "practical" approaches, however. For instance, in applications where we might want an orthogonal matrix whose first column is $u$, it is common to use the Householder transformation which maps $e_1$ to $u$.
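A sketch of the Householder approach (the helper name is mine): reflecting across the hyperplane orthogonal to $w = e_1 - u$ maps $e_1$ to $u$ (both are unit vectors), and the resulting matrix $H = I - 2ww^t/\|w\|^2$ is orthogonal with first column $He_1 = u$.

```python
import numpy as np

def householder_with_first_column(u):
    """Orthogonal Householder matrix H with H @ e1 = u, so the first
    column of H is u. Assumes u is a unit vector."""
    n = u.shape[0]
    e1 = np.zeros(n); e1[0] = 1.0
    w = e1 - u
    if np.linalg.norm(w) < 1e-14:          # u == e1: the identity works
        return np.eye(n)
    w /= np.linalg.norm(w)
    return np.eye(n) - 2.0 * np.outer(w, w)  # reflection across w's hyperplane

u = np.array([0.0, 0.6, 0.8])              # example unit vector
H = householder_with_first_column(u)
# H is orthogonal and its first column is u.
```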