Let $A \in \mathbb{R}^{m \times n}$ be a matrix with singular value decomposition
$$ A = U \tilde{\Sigma} V^{T}. $$
Let $\text{rank}(A) = r \leq \min(m, n)$. My textbook notes that the first $r$ columns of $U$ then constitute a basis for $\mathcal{R}(A)$ (the range of $A$). That makes sense, since $Ax = U \tilde{\Sigma} V^{T} x = U \tilde{x}$ where $\tilde{x} = \tilde{\Sigma}V^{T} x$.
However, my book also states that the first $n-r$ columns of $V$ constitute a basis for $\mathcal{N}(A)$ (the null space of $A$). Shouldn't it instead be that the last $n-r$ rows of $V$ are a basis for $\mathcal{N}(A)$? If we want $V^{T} x = 0$, then multiplication by $x$ must produce a linear combination of the rows of $V$ that sum to the zero vector.
The assertion is true if the singular values are ordered so that the first $n-r$ diagonal entries of $\tilde \Sigma$ are zero.
Let $v_1, v_2, \dots $ be the columns of $V$, and let $$v= \sum_{i=1}^{n-r}x_i v_i$$ be a linear combination of the first $n-r$ of them. The first entry of $v$ is then the corresponding linear combination of the first entries of the $v_i$, the second entry of the second entries, and so on.
Thus we can write $v=Vx$, where $x= \begin{pmatrix}x_1\\ x_2\\ \vdots \\ x_{n-r}\\ 0 \\ \vdots \\0 \end{pmatrix}.$
Since $V$ is orthogonal we have $V^T V = I$, so that
$$Av = U \tilde \Sigma V^T V x = U \tilde \Sigma x.$$
Now, we required $\tilde \Sigma$ to have zeros in its first $n-r$ diagonal entries, and these are precisely the positions where $x$ can be nonzero, so we get $$\tilde \Sigma x = 0, \mbox{ hence } Av = 0.$$
If the zeros were at the bottom of the diagonal of $\tilde \Sigma$ rather than at the top (as in the usual convention of listing singular values in decreasing order), then indeed we would need to look at the last $n-r$ columns of $V$.
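As a quick numerical sanity check (the matrix below is just a made-up rank-2 example): NumPy's `np.linalg.svd` returns the singular values in decreasing order, so the zero singular values sit at the bottom of the diagonal and the null space is spanned by the *last* $n-r$ columns of $V$, i.e. the last rows of `Vt`.

```python
import numpy as np

# A hypothetical rank-2 matrix with m = 3, n = 4 (so n - r = 2);
# the second row is twice the first.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])

U, s, Vt = np.linalg.svd(A)     # s is sorted in decreasing order
r = int(np.sum(s > 1e-10))      # numerical rank

null_basis = Vt[r:].T           # last n - r columns of V
print(r)                        # the rank of A
print(np.allclose(A @ null_basis, 0))  # those columns lie in N(A)
```

Applying $A$ to each of these columns returns (numerically) the zero vector, consistent with the argument above.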
Addendum: I'm pretty sure you should be able to prove this with some kind of nice duality argument (since from $A= U\tilde \Sigma V^T$ we get $A^T = V \tilde \Sigma ^T U^T,$ which shows that the columns of $V$ with nonzero singular values span $\mathcal R(A^T) = (\mathcal N(A))^\perp$).
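This duality can also be checked numerically (again with a made-up rank-2 matrix, and using NumPy's decreasing-order convention, so the nonzero singular values come *first*): we reassemble $A^T = V \tilde\Sigma^T U^T$, then verify that projecting the columns of $A^T$ onto the span of the first $r$ columns of $V$ leaves them unchanged, i.e. those columns span $\mathcal R(A^T)$.

```python
import numpy as np

A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))

# Reassemble A^T = V Sigma^T U^T and compare with A.T.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
print(np.allclose(A.T, Vt.T @ Sigma.T @ U.T))

# Orthogonal projector onto the span of the first r columns of V.
# It fixes every column of A^T, so those columns span R(A^T).
P = Vt[:r].T @ Vt[:r]
print(np.allclose(P @ A.T, A.T))
```

The projector check works because $V$'s columns are orthonormal, so $P = V_r V_r^T$ is exactly the orthogonal projection onto their span.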