How to prove matrix geometric convergence to any matrix?


Suppose I have two vectors $x$ and $v$, and we want to calculate the following expression: $$(I+x\cdot v^{T})^{-1}.$$ My professor said that we could treat this as a "geometric series" $$(I-U)^{-1} = I + U + U^2 + U^3 + \cdots$$ and, replacing $U$ by $-U$, $$(I+U)^{-1} = I - U + U^2 - U^3 + \cdots$$ Also, observe that $$(x \cdot v^{T})^{2} = x \cdot v^{T} \cdot x \cdot v^{T},$$ and since $v^{T} \cdot x$ is a scalar, let's call it $c$. Rearranging, $$(x \cdot v^{T})^{2} = c \cdot x \cdot v^{T}.$$ Therefore, by induction, $$(x \cdot v^{T})^{r} = c^{r-1} \cdot x \cdot v^{T}.$$
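To convince myself of the rank-one power identity, here is a quick numerical check (the vectors are arbitrary example values, not from the problem):

```python
import numpy as np

# Hypothetical example vectors; any column vectors of matching size work.
x = np.array([[1.0], [2.0], [3.0]])
v = np.array([[0.5], [-1.0], [0.25]])

U = x @ v.T          # the rank-one matrix x v^T
c = float(v.T @ x)   # the scalar c = v^T x

# Check (x v^T)^r = c^(r-1) x v^T for several powers r.
for r in range(1, 6):
    lhs = np.linalg.matrix_power(U, r)
    rhs = c ** (r - 1) * U
    assert np.allclose(lhs, rhs)
```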

Now (here is the 1st question): somehow we use the expansion $$(I+U)^{-1} = I - U + U^2 - U^3 + \cdots$$ with $U = x\cdot v^T$ to infer that $$(I+x\cdot v^{T})^{-1} = I - (1+c)^{-1} \cdot x\cdot v^T.$$
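For the record, here is the step that turns the series into the closed form, assuming $|c| < 1$ so that the scalar geometric series converges:
$$(I + xv^T)^{-1} = I - xv^T + (xv^T)^2 - (xv^T)^3 + \cdots = I - (1 - c + c^2 - \cdots)\,xv^T = I - \frac{1}{1+c}\,xv^T.$$
Each term $(xv^T)^r$ collapses to $c^{r-1}xv^T$ by the identity above, so the whole tail is a scalar geometric series times $xv^T$.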

I did a couple of examples and it actually works (of course). Then I dug deeper, and I found on Google that for this series to converge, $U$ must satisfy some conditions, for example being a convergent matrix (spectral radius less than $1$).

My 2nd question is: why does this work even though $U$ seems to have no restriction?

Best Answer

This does not work for all $x,v$. The resulting matrix $I-xv^T$ needs to be invertible.

If for instance $x=v= \pmatrix{1\\0\\ \vdots \\ 0 }$, then $I-xv^T$ is not invertible.
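A quick numerical check of that counterexample (a sketch with numpy; note that here $c = v^Tx = 1$, so the denominator $1 - c$ in the formula below vanishes):

```python
import numpy as np

# x = v = first standard basis vector e1.
e1 = np.zeros((3, 1))
e1[0] = 1.0

M = np.eye(3) - e1 @ e1.T   # zeroes out the (1,1) entry of the identity

# M is singular: its rank is 2, not 3, so no inverse exists.
assert np.linalg.matrix_rank(M) == 2
```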

On the other hand, if $I-xv^T$ is invertible, then the inverse is given by your formula $$ (I-xv^T)^{-1} = I + \frac1{1 - v^Tx}xv^T $$ since $$ (I-xv^T)\left(I + \frac1{1 - v^Tx}xv^T\right)= I-xv^T + \frac1{1 - v^Tx}xv^T -\frac{v^Tx}{1 - v^Tx}xv^T=I. $$ You might want to check the Sherman-Morrison formula.
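The point about convergence vs. invertibility can be seen numerically: with vectors chosen so that $|c| > 1$ (hypothetical example values below), the Neumann series diverges, yet the closed-form inverse still works as long as $1 + c \neq 0$:

```python
import numpy as np

# Hypothetical vectors with c = v^T x = 7, well outside (-1, 1):
# the geometric series for (I + x v^T)^{-1} diverges, but the
# closed-form expression is still a valid inverse.
x = np.array([[2.0], [1.0], [0.0]])
v = np.array([[3.0], [1.0], [1.0]])
c = float(v.T @ x)   # c = 7

A = np.eye(3) + x @ v.T
A_inv = np.eye(3) - (1.0 / (1.0 + c)) * (x @ v.T)  # closed-form inverse

assert np.allclose(A @ A_inv, np.eye(3))
assert np.allclose(A_inv @ A, np.eye(3))
```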