My problem:
Assume $A$ is a non-invertible $n \times n$ matrix of rank $r$ that can be expressed as $A=U\Sigma V^T = \sum_{i=1}^r \sigma_i u_i v_i^T$.
Given $B=\sum_{i=1}^r \frac{1}{\sigma_i} v_i u_i^T$, show that $BAx = x$ for all $x = \sum_{i=1}^r \alpha_iv_i$.
What I have done so far:
I rewrote $B$ as $B = V\Sigma^+U^T$, where $\Sigma^+$ is an $n \times n$ diagonal matrix with: $\sigma^+_i = \begin{cases} \sigma^{-1}_i & \textit{if } i \leq r \\ 0 & \textit{otherwise} \end{cases}$
I rewrote $x$ as $x = \sum_{i=1}^r \alpha_i v_i = \bar{\alpha} V$ where $\bar{\alpha} = \begin{pmatrix} \alpha_1, & \alpha_2, & \dots, & \alpha_r, & 0, & \dots, & 0 \end{pmatrix}^T = \begin{pmatrix} \underset{(r\times 1)}{\bar{\alpha}}\\ 0 \end{pmatrix} \in \mathbb{R}^n$
Question 1:
I used $\underset{n \times n}{V} = \begin{pmatrix} \underset{(r\times r)}{V} & 0\\ 0 & 0 \end{pmatrix}$, $\underset{n \times n}{\Sigma} = \begin{pmatrix} \underset{(r\times r)}{\Sigma} & 0\\ 0 & 0 \end{pmatrix}$, $\underset{n \times n}{\Sigma^+} = \begin{pmatrix} \underset{(r\times r)}{\Sigma^+} & 0\\ 0 & 0 \end{pmatrix}$ because I assumed that beyond index $r$, the entries of $\Sigma$ would all be $0$ (in the definition the summation runs up to $r$, and $r \leq n$).
Is it correct? If not why?
- I substituted the new terms in the expression:
$BAx = x \rightarrow (V\Sigma^+ U^T) (U \Sigma V^T) x = x \rightarrow (V \Sigma^+) (\Sigma V^T) \bar{\alpha} V = \bar{\alpha} V \rightarrow$
$\rightarrow V \underbrace{ \begin{pmatrix} \underset{(r\times r)}{\Sigma^{-1}} & 0\\ 0 & 0 \end{pmatrix} \begin{pmatrix} \underset{(r\times r)}{\Sigma} & 0\\ 0 & 0 \end{pmatrix}}_{\mathcal{I}_{r\times r}} V^T \bar{\alpha} V = \bar{\alpha} V \rightarrow$
$\rightarrow \underbrace{ \begin{pmatrix} \underset{(r\times r)}{V} & 0\\ 0 & 0 \end{pmatrix} \begin{pmatrix} \underset{(r\times r)}{\mathcal{I}} & 0\\ 0 & 0 \end{pmatrix} \begin{pmatrix} \underset{(r\times r)}{V^T} & 0\\ 0 & 0 \end{pmatrix}}_{\mathcal{I}_{r\times r}} \bar{\alpha} V = \bar{\alpha} V \rightarrow \bar{\alpha} V = \bar{\alpha} V$
Question 2:
Is it correct? If not, why? How can I prove the assignment?
Thanks for your help.
Regarding question 1: you should write $x = V \bar \alpha$ rather than $x = \bar \alpha V$; note that $x$ must be a column vector, so it can only be multiplied by a matrix from the left. Also, the block decompositions of $U$ and $V$ are not justified: these matrices are orthogonal, so they cannot have zero rows or zero columns. (The block form of $\Sigma$ and $\Sigma^+$ is fine.)
Regarding the problem itself, here is a much simpler proof. If $x = \sum_{i=1}^r \alpha_i v_i$, then we can write $$ \begin{align} Ax & = \left(\sum_{i=1}^r \sigma_i u_iv_i^T \right)\left(\sum_{j=1}^r \alpha_j v_j \right) \\ & = \sum_{i=1}^r \sum_{j=1}^r \alpha_j\sigma_i \cdot u_i(v_i^Tv_j) \\ & = \sum_{i=1}^r \alpha_i\sigma_i \cdot u_i, \end{align} $$ and similarly $$ \begin{align} BAx & = B(Ax) = \left(\sum_{i=1}^r \frac 1{\sigma_i} v_iu_i^T \right)\left(\sum_{j=1}^r \alpha_j \sigma_j u_j \right) \\ & = \sum_{i=1}^r \sum_{j=1}^r \frac{\alpha_j \sigma_j}{\sigma_i} v_i(u_i^Tu_j) \\ & = \sum_{i=1}^r \frac{\alpha_i \sigma_i}{\sigma_i} v_i = x, \end{align} $$ which was what we wanted.
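As a sanity check (not part of the proof), the sum-based argument above can be verified numerically. The sketch below, with variable names of my own choosing, builds a random rank-$r$ matrix, forms $B$ from the outer products $\frac{1}{\sigma_i} v_i u_i^T$, and confirms $BAx = x$ for an $x$ in the span of $v_1, \dots, v_r$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3

# Random n x n matrix of rank r, with SVD A = U Sigma V^T.
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
U, s, Vt = np.linalg.svd(A)  # rows of Vt are the vectors v_i^T

# B = sum_{i=1}^r (1/sigma_i) v_i u_i^T, using only the r nonzero singular values.
B = sum((1.0 / s[i]) * np.outer(Vt[i], U[:, i]) for i in range(r))

# x = sum_{i=1}^r alpha_i v_i lies in the span of v_1, ..., v_r.
alpha = rng.standard_normal(r)
x = sum(alpha[i] * Vt[i] for i in range(r))

assert np.allclose(B @ A @ x, x)  # B A x = x, as the sums above show
```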
If you prefer to work with matrices, you could also have used $x = V \bar \alpha$ and $B = V \Sigma^+ U^T$ to find that $$ \begin{align} BAx & = (V \Sigma^+ U^T)(U\Sigma V^T)(V\bar \alpha) \\ & = V \Sigma^+ (U^TU)\Sigma (V^TV)\bar \alpha \\ & = V (\Sigma^+ \Sigma) \bar \alpha \\ & = V \bar \alpha = x. \end{align} $$ However, the fact that $U^TU, V^TV$ are identity matrices and that $(\Sigma^+ \Sigma) \bar \alpha = \bar \alpha$ should be justified.
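The matrix-form computation admits the same kind of numerical check (again an illustrative sketch, with my own variable names). Here $\Sigma^+$ inverts the first $r$ singular values and zeroes the rest, and $\bar\alpha$ has zeros beyond index $r$, which is exactly why $(\Sigma^+\Sigma)\bar\alpha = \bar\alpha$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 5, 3
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
U, s, Vt = np.linalg.svd(A)

Sigma = np.diag(s)
# Sigma^+ inverts the r nonzero singular values and zeroes the rest.
Sigma_plus = np.diag([1.0 / s[i] if i < r else 0.0 for i in range(n)])
B = Vt.T @ Sigma_plus @ U.T  # B = V Sigma^+ U^T

# alpha_bar has zeros beyond index r, so x = V alpha_bar is spanned by v_1..v_r.
alpha_bar = np.concatenate([rng.standard_normal(r), np.zeros(n - r)])
x = Vt.T @ alpha_bar

# (Sigma^+ Sigma) alpha_bar = alpha_bar: Sigma^+ Sigma is the identity on the
# first r coordinates, and alpha_bar vanishes on the remaining ones.
assert np.allclose(Sigma_plus @ Sigma @ alpha_bar, alpha_bar)
assert np.allclose(B @ A @ x, x)
```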