Let $\textbf{A} \in \mathbb{C}^{n \times n}$ with $\operatorname{rank}(\textbf{A}) = r$, and let the singular values of $\textbf{A}$ be $\sigma_{1} \geq \dots \geq \sigma_{r} > 0$.
Let $\textbf{u}_j$ and $\textbf{v}_j$ for $1 \leq j \leq r$ denote the left and right singular vectors of $\textbf{A}$, respectively.
Now, suppose we have a matrix
$$\textbf{B} = \alpha\sum_{j=1}^{k}\textbf{u}_j \sigma_{j} \textbf{v}_j^{*}$$
where $\alpha \in \mathbb{C}$ is a fixed scalar and $k < r$.
I want to find $\|\textbf{A} - \textbf{B}\|_{2}$, which equals the largest singular value of $\textbf{A} - \textbf{B}$.
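For sanity-checking my algebra I've been using random test matrices (the setup below is my own, not part of the problem): the spectral norm returned by `np.linalg.norm(·, 2)` is just the top singular value.

```python
# Numerical sketch on random data: ||A - B||_2 is the largest
# singular value of A - B. Dimensions and alpha are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, k, alpha = 6, 2, 0.3 + 0.5j

# Random complex test matrix A and its SVD.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, s, Vh = np.linalg.svd(A)

# B = alpha * sum_{j<=k} u_j sigma_j v_j^*  (columns of U scaled by s).
B = alpha * (U[:, :k] * s[:k]) @ Vh[:k, :]

diff = A - B
# The 2-norm and the top singular value of A - B agree.
print(np.linalg.norm(diff, 2))
print(np.linalg.svd(diff, compute_uv=False)[0])
```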
$\textbf{Proof Attempt}$
The obvious move to me would be writing $\textbf{A}$ as a sum of outer products.
$$\textbf{A} = \sum_{j=1}^{r}\textbf{u}_j \sigma_{j} \textbf{v}_j^{*}$$
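As a quick check that this outer-product form is right (random test matrix, not the $\textbf{A}$ of the question):

```python
# The sum of rank-one terms u_j sigma_j v_j^* rebuilds A exactly.
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, s, Vh = np.linalg.svd(A)  # rows of Vh are already the v_j^*

# Sum of rank-one outer products u_j sigma_j v_j^*.
A_rebuilt = sum(s[j] * np.outer(U[:, j], Vh[j, :]) for j in range(n))

# Largest entrywise reconstruction error (should be ~machine epsilon).
print(np.max(np.abs(A - A_rebuilt)))
```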
Rewriting $\textbf{A} - \textbf{B}$ gives
$$\textbf{A} - \textbf{B} = (1 - \alpha)\sum_{j=1}^{k}\textbf{u}_j \sigma_{j} \textbf{v}_j^{*} + \sum_{j=k+1}^{r}\textbf{u}_j \sigma_{j} \textbf{v}_j^{*}$$
This is where I get stuck. Because $\alpha$ is an arbitrary complex scalar, the coefficients $(1 - \alpha)\sigma_{j}$ in the first sum need not be real and non-negative, so the expression above is not a valid SVD as written, and I don't see how to read off the singular values of $\textbf{A} - \textbf{B}$ from it.
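Numerically, the natural guess seems to hold (this formula is my conjecture, not something I can prove yet): absorbing the phase of $1 - \alpha$ into the vectors $\textbf{u}_j$ would give singular values $|1 - \alpha|\sigma_j$ for the first sum, suggesting $\|\textbf{A} - \textbf{B}\|_2 = \max(|1 - \alpha|\,\sigma_1,\ \sigma_{k+1})$.

```python
# Numerical sketch (random data, not a proof) of the conjecture
#   ||A - B||_2 = max(|1 - alpha| * sigma_1, sigma_{k+1}).
import numpy as np

rng = np.random.default_rng(2)
n, k, alpha = 7, 3, 1.4 - 0.8j  # arbitrary test values

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, s, Vh = np.linalg.svd(A)

# B = alpha * sum_{j<=k} u_j sigma_j v_j^*.
B = alpha * (U[:, :k] * s[:k]) @ Vh[:k, :]

norm_AB = np.linalg.norm(A - B, 2)
guess = max(abs(1 - alpha) * s[0], s[k])  # s[k] is sigma_{k+1} (0-based)
print(norm_AB)
print(guess)
```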