Rate of convergence of projection matrix


For $0 \le \epsilon \ll 1$ and matrices $A \in \mathbb R^{n \times p}$ and $B \in \mathbb R^{m \times p}$ define $$C(\epsilon, A, B) = \begin{bmatrix} A\\ \epsilon B\end{bmatrix} \in \mathbb R^{(n+m) \times p}.$$

Now, for $1 \le r \le \operatorname{rank}(A)$, define the projection operator $\Pi_r(A) := \sum_{i=1}^r v_iv_i^T$, where $A = U\Sigma V^T = \sum_i \sigma_i u_iv_i^T$ is the singular-value decomposition (SVD) of $A$, with singular values ordered $\sigma_1 \ge \sigma_2 \ge \dots$. Assuming the gap condition $\sigma_r(A) > \sigma_{r+1}(A)$, so that $\Pi_r(A)$ is well defined, it is intuitively clear that $\Pi_r(C(\epsilon, A, B)) - \Pi_r(A) \rightarrow 0$ (e.g., in the spectral norm) as $\epsilon \rightarrow 0^+$: indeed, $C(\epsilon, A, B)^T C(\epsilon, A, B) = A^TA + \epsilon^2 B^TB$, so the right singular vectors of $C(\epsilon, A, B)$ are the eigenvectors of a perturbation of $A^TA$.

Question: What is a good estimate for the error $\|\Pi_r(C(\epsilon, A, B)) - \Pi_r(A)\|$ as a function of $\epsilon$? Is it $\mathcal{O}(\sqrt{\epsilon})$, $\mathcal{O}(\epsilon)$, or $\mathcal{O}(\epsilon^2)$?
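For what it's worth, here is a quick numerical sketch of the question, using random Gaussian $A$ and $B$ (the dimensions, seed, and the helper `proj` are my own choices, not part of the question). It builds $C(\epsilon, A, B)$ for a few values of $\epsilon$ and prints the spectral-norm error, so one can eyeball how the error shrinks as $\epsilon$ decreases:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p, r = 8, 5, 6, 3  # arbitrary small dimensions for the experiment

A = rng.standard_normal((n, p))
B = rng.standard_normal((m, p))

def proj(M, r):
    """Orthogonal projector onto the span of the top-r right singular vectors of M."""
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:r].T @ Vt[:r]

Pi_A = proj(A, r)
for eps in [1e-1, 1e-2, 1e-3]:
    C = np.vstack([A, eps * B])          # C(eps, A, B) = [A; eps*B]
    err = np.linalg.norm(proj(C, r) - Pi_A, 2)  # spectral norm of the difference
    print(f"eps = {eps:.0e}   error = {err:.3e}")
```

A generic random $A$ has a singular-value gap $\sigma_r > \sigma_{r+1}$, so the projector is well defined for all three values of $\epsilon$; comparing the printed errors across a decade of $\epsilon$ suggests which power of $\epsilon$ governs the decay.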