Bound for the inverse of a summation of rank-1 matrices


Given vectors $x_1,\dots,x_T \in \mathbb{R}^d$ satisfying $\|x_i\|_2 = 1$, define $A_0 = I$ and $A_t = I + \sum_{i=1}^t x_i x_i^\top$ for $t \geq 1$. We are interested in the following quantity: \begin{align} S_{d,T} = \sum_{i=1}^T\|A_{i-1}^{-1}x_i\|_2. \end{align} If $d = 1$, then each $x_i = \pm 1$, so $A_{i-1} = i$ and it is easy to bound $S_{1,T}$ by $\log T$ as follows: \begin{align} S_{1,T} = \sum_{i=1}^T\frac{1}{i} = O(\log T). \end{align} Is it true that for $d \geq 2$ we still have the similar bound $S_{d,T} \leq O(\mathrm{poly}(d)\log T)$, where $\mathrm{poly}(d)$ denotes some polynomial in $d$?
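For concreteness, $S_{d,T}$ can be evaluated numerically for any choice of unit vectors. Below is a minimal sketch (assuming NumPy; the random-vector input is just one illustrative choice, not part of the question):

```python
import numpy as np

def S(xs):
    """Compute S_{d,T} = sum_i ||A_{i-1}^{-1} x_i||_2 for unit columns xs[:, i]."""
    d, T = xs.shape
    A = np.eye(d)  # A_0 = I
    total = 0.0
    for i in range(T):
        x = xs[:, i]
        total += np.linalg.norm(np.linalg.solve(A, x))  # ||A_{i-1}^{-1} x_i||
        A += np.outer(x, x)  # A_i = A_{i-1} + x_i x_i^T
    return total

rng = np.random.default_rng(0)
d, T = 5, 1000
xs = rng.standard_normal((d, T))
xs /= np.linalg.norm(xs, axis=0)  # normalize columns to unit length
print(S(xs))
```

As a sanity check, for $d = 1$ every column is $\pm 1$ and the routine reproduces the harmonic sum $\sum_{i=1}^T 1/i$.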




I'm not sure my answer is correct.

(1) When $T \leq d$, we have \begin{equation} S_{d,T} = \sum_{i=1}^T 1 = T. \end{equation} Here we choose $x_1, x_2, \dots, x_T$ to be mutually orthogonal.
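This case can be checked numerically; a small sketch (assuming NumPy, with the standard basis vectors as the mutually orthogonal choice):

```python
import numpy as np

d = 6
xs = np.eye(d)  # e_1, ..., e_d: mutually orthogonal unit vectors
A = np.eye(d)   # A_0 = I
total = 0.0
for i in range(d):
    # x_i is orthogonal to all previously added vectors, so A_{i-1}^{-1} x_i = x_i
    total += np.linalg.norm(np.linalg.solve(A, xs[:, i]))
    A += np.outer(xs[:, i], xs[:, i])
print(total)  # each term is exactly 1, so total = d = 6
```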

It is obvious that $\|A_0^{-1}x_1\| = \|x_1\| = 1$. By the Sherman–Morrison formula, we have \begin{equation} A_1^{-1} = I - \frac{1}{2}x_1 x_1^\top. \end{equation} Thus, by the definition of the operator norm, we have \begin{equation} \|A_1^{-1}x_2\| \leq \|A_1^{-1}\|_2. \end{equation} Since $d \geq T$, we can find a unit eigenvector of $A_1^{-1}$ orthogonal to $x_1$; its corresponding eigenvalue is of course $1$, the largest eigenvalue of $A_1^{-1}$. Thus, choosing $x_2$ with $x_2^\top x_1 = 0$, equality holds in the inequality above, i.e., \begin{equation} \|A_1^{-1}x_2\| = 1. \end{equation} Next, for these particular $x_1, x_2$, we find that \begin{equation} A_2^{-1} = I - \frac{1}{2} x_1 x_1^\top - \frac{1}{2} x_2 x_2^\top. \end{equation} Again, we have \begin{equation} \|A_2^{-1}x_3\| \leq \|A_2^{-1}\|_2, \end{equation} and since $d \geq T$ we can find a unit eigenvector orthogonal to both $x_1$ and $x_2$; setting $x_3$ to be this eigenvector makes the inequality an equality: \begin{equation} \|A_2^{-1}x_3\| = 1. \end{equation} The rest can be done in the same manner.
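The Sherman–Morrison step can be verified directly: for a unit vector $x$, $(I + xx^\top)^{-1} = I - \frac{1}{1 + x^\top x}\,xx^\top = I - \frac{1}{2}xx^\top$. A quick numerical check (assuming NumPy; the dimension and random vector are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4)
x /= np.linalg.norm(x)  # unit vector, so x^T x = 1

A1 = np.eye(4) + np.outer(x, x)
# Sherman-Morrison: (I + x x^T)^{-1} = I - x x^T / (1 + x^T x) = I - (1/2) x x^T
inv_formula = np.eye(4) - 0.5 * np.outer(x, x)
print(np.allclose(np.linalg.inv(A1), inv_formula))  # True
```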

(2) When $d < T \leq 2d$, we have \begin{equation} S_{d,T} = d + \frac{T-d}{2} = \frac{T+d}{2}. \end{equation}

For the first $d$ terms, we can still choose $x_1, x_2, \dots, x_d$ to be mutually orthogonal. Thus \begin{equation} S_{d,d} = \sum_{i=1}^d \|A_{i-1}^{-1} x_i\|_2 = d. \end{equation}

For the remaining terms, we have to reuse $x_1, \dots, x_d$. For example, for $i = d+1$ we have \begin{equation} A_d^{-1} = I - \frac{1}{2} x_1 x_1^\top - \dots - \frac{1}{2} x_d x_d^\top. \end{equation} The eigenvectors of $A_d^{-1}$ are $x_1, \dots, x_d$, and their corresponding eigenvalues are all $\frac{1}{2}$ (since the $x_i$ span $\mathbb{R}^d$, this is the largest eigenvalue). Without loss of generality, we can let $x_{d+1}$ be $x_1$, and in this case \begin{equation} \|A_{d}^{-1}x_{d+1}\| = \frac{1}{2}. \end{equation} Then \begin{equation} A_{d+1}^{-1} = I - \frac{2}{3} x_1 x_1^\top - \frac{1}{2} x_2 x_2^\top - \dots - \frac{1}{2} x_d x_d^\top. \end{equation} The rest can be done in the same manner.
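Both claims in this step can be checked numerically. With the standard basis as the orthonormal choice, $A_d = 2I$, so the $(d+1)$-th term is $1/2$, and $A_{d+1}^{-1}$ matches the stated expression. A sketch (assuming NumPy):

```python
import numpy as np

d = 4
xs = np.eye(d)               # orthonormal x_1, ..., x_d (standard basis)
A = np.eye(d) + xs @ xs.T    # A_d = I + sum_i x_i x_i^T = 2I
x_next = xs[:, 0]            # reuse x_{d+1} = x_1
print(np.linalg.norm(np.linalg.solve(A, x_next)))  # 0.5

# Check A_{d+1}^{-1} = I - (2/3) x_1 x_1^T - (1/2) sum_{i>=2} x_i x_i^T
A_next = A + np.outer(x_next, x_next)
expected_inv = np.eye(d) - (2/3) * np.outer(xs[:, 0], xs[:, 0]) \
    - 0.5 * sum(np.outer(xs[:, i], xs[:, i]) for i in range(1, d))
print(np.allclose(np.linalg.inv(A_next), expected_inv))  # True
```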

(3) When $2d < T \leq 3d$, ...

In conclusion, write $T = ad + b$, where $a$ is the quotient and $b$ the remainder of $T$ divided by $d$. Then \begin{equation} S_{d, T} = \sum_{k=1}^a \frac{d}{k} + \frac{b}{a+1}. \end{equation} Since $\sum_{k=1}^a \frac{1}{k} = O(\log a)$, this construction gives $S_{d,T} = O(d \log(T/d))$, which is consistent with the conjectured $O(\mathrm{poly}(d)\log T)$ bound.
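The closed form can be compared against a direct simulation of the construction (cycling through an orthonormal basis); a sketch assuming NumPy:

```python
import numpy as np

def S_construction(d, T):
    """S_{d,T} for the construction: cycle through the standard basis e_1, ..., e_d."""
    A = np.eye(d)
    total = 0.0
    for i in range(T):
        x = np.eye(d)[:, i % d]  # reuse e_1, ..., e_d cyclically
        total += np.linalg.norm(np.linalg.solve(A, x))
        A += np.outer(x, x)
    return total

def S_formula(d, T):
    """Closed form: sum_{k=1}^a d/k + b/(a+1), where T = a*d + b."""
    a, b = divmod(T, d)
    return sum(d / k for k in range(1, a + 1)) + b / (a + 1)

d, T = 3, 11  # a = 3, b = 2
print(S_construction(d, T), S_formula(d, T))  # both equal 6.0
```

During the $k$-th full pass through the basis, $A$ equals $kI$ plus the vectors already re-added in that pass, so each of the $d$ terms contributes $1/k$, matching the formula term by term.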