Bound for the inverse of a summation of rank-1 matrices

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Given vectors $x_1,\dots,x_T \in \mathbb{R}^d$ satisfying $\|x_i\|_2 = 1$, define $A_0 = I$ and $A_t = I + \sum_{i=1}^t x_i x_i^\top$ for $t \geq 1$. We are interested in the quantity \begin{align} S_{d,T} = \sum_{i=1}^T \|A_{i-1}^{-1} x_i\|_2. \end{align} If $d = 1$, it is easy to bound $S_{1,T}$ by $\log T$: since $A_{i-1} = i$, \begin{align} S_{1,T} = \sum_{i=1}^T \frac{1}{i} = O(\log T). \end{align} Is it true that for $d \geq 2$ we still have a similar bound $S_{d,T} \leq O(\mathrm{poly}(d)\log T)$, where $\mathrm{poly}(d)$ denotes some polynomial in $d$?

There is 1 answer below.
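Before looking at the general case, the quantity is easy to check numerically. Below is a minimal sketch (the helper `S` is my own naming, not from the question); for $d = 1$ it reproduces the harmonic sum:

```python
import numpy as np

def S(xs):
    """S_{d,T} = sum_i ||A_{i-1}^{-1} x_i||_2 for the rows of xs (shape T x d)."""
    d = xs.shape[1]
    A = np.eye(d)                                       # A_0 = I
    total = 0.0
    for x in xs:
        total += np.linalg.norm(np.linalg.solve(A, x))  # ||A_{i-1}^{-1} x_i||_2
        A += np.outer(x, x)                             # A_i = A_{i-1} + x_i x_i^T
    return total

# d = 1: every unit "vector" is +-1 and A_{i-1} = i, so S_{1,T} is the harmonic sum.
T = 1000
print(S(np.ones((T, 1))))                    # matches 1 + 1/2 + ... + 1/T
print(sum(1.0 / i for i in range(1, T + 1)))
```

Using `np.linalg.solve(A, x)` rather than forming `A^{-1}` explicitly keeps the check numerically stable for larger $T$.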
I'm not sure my answer is correct.
(1) When $T \leq d$, we have \begin{equation} S_{d,T} = \sum_{i=1}^T 1 = T. \end{equation} Here we need $x_1, x_2, \dots, x_T$ to be mutually orthogonal.
It is obvious that $\|A_0^{-1}x_1\| = \|x_1\| = 1$. By the Sherman–Morrison formula, \begin{equation} A_1^{-1} = I - \frac{1}{2}x_1 x_1^\top. \end{equation} Thus, by the definition of the operator norm, \begin{equation} \|A_1^{-1}x_2\| \leq \|A_1^{-1}\|_2. \end{equation} Since $d \geq T$, we can find an eigenvector of $A_1^{-1}$ that is orthogonal to $x_1$; its eigenvalue is of course $1$, the largest eigenvalue of $A_1^{-1}$. Thus, letting $x_2^\top x_1 = 0$, equality holds above, i.e., \begin{equation} \|A_1^{-1}x_2\| = 1. \end{equation} Next, for these $x_1, x_2$, we find that \begin{equation} A_2^{-1} = I - \frac{1}{2} x_1 x_1^\top - \frac{1}{2} x_2 x_2^\top. \end{equation} Again, we have \begin{equation} \|A_2^{-1}x_3\| \leq \|A_2^{-1}\|_2, \end{equation} and we can find an eigenvector orthogonal to both $x_1$ and $x_2$ (again because $d \geq T$); equality holds when $x_3$ is set to this eigenvector: \begin{equation} \|A_2^{-1}x_3\| = 1. \end{equation} The rest can be done in the same manner.
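Case (1) can be verified numerically. A small sketch using the standard basis as the mutually orthogonal $x_i$ (the variable names below are my own):

```python
import numpy as np

# Case (1): T <= d with mutually orthogonal unit vectors (standard basis here).
# Each x_i is untouched by A_{i-1}^{-1}, so every term is exactly 1.
d, T = 8, 5
xs = np.eye(d)[:T]                 # x_i = e_i, pairwise orthogonal

A = np.eye(d)                      # A_0 = I
terms = []
for x in xs:
    terms.append(np.linalg.norm(np.linalg.solve(A, x)))
    A += np.outer(x, x)

print(terms)        # five values, each ~ 1.0
print(sum(terms))   # ~ 5.0 = T
```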
(2) When $d < T \leq 2d$, we have \begin{equation} S_{d,T} = d + \frac{T-d}{2} = \frac{T+d}{2}. \end{equation}
For the first $d$ terms, we can still choose $x_1, x_2, \dots, x_d$ to be mutually orthogonal. Thus \begin{equation} S_{d,d} = \sum_{i=1}^d \|A_{i-1}^{-1} x_i\|_2 = d. \end{equation}
For the remaining terms, we reuse $x_1, \dots, x_d$. For example, for $i = d+1$, we have \begin{equation} A_d^{-1} = I - \frac{1}{2} x_1 x_1^\top - \dots - \frac{1}{2} x_d x_d^\top. \end{equation} The eigenvectors of $A_d^{-1}$ are $x_1, \dots, x_d$, with corresponding eigenvalues all equal to $\frac{1}{2}$. WLOG, let $x_{d+1} = x_1$; then \begin{equation} \|A_d^{-1} x_{d+1}\| = \frac{1}{2}, \end{equation} and \begin{equation} A_{d+1}^{-1} = I - \frac{2}{3} x_1 x_1^\top - \frac{1}{2} x_2 x_2^\top - \dots - \frac{1}{2} x_d x_d^\top. \end{equation} The rest can be done in the same manner.
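Case (2) checks out numerically as well: after one pass through an orthogonal basis, each reused vector picks up a factor of $\frac{1}{2}$. A minimal sketch with the standard basis (my own construction of the $x_i$):

```python
import numpy as np

# Case (2): d < T <= 2d. First one pass through the basis (d terms of 1),
# then reuse x_1, x_2, ...; each reused vector contributes 1/2.
d, T = 4, 7
basis = np.eye(d)
xs = np.vstack([basis, basis[:T - d]])   # x_{d+j} = x_j

A = np.eye(d)                            # A_0 = I
total = 0.0
for x in xs:
    total += np.linalg.norm(np.linalg.solve(A, x))
    A += np.outer(x, x)

print(total, (T + d) / 2)  # both 5.5 for d = 4, T = 7
```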
(3) When $2d < T \leq 3d$, the pattern continues: the $T - 2d$ extra terms each contribute $\frac{1}{3}$, so $S_{d,T} = d + \frac{d}{2} + \frac{T-2d}{3}$, and so on.
In conclusion, suppose $T$ divided by $d$ has quotient $a$ and remainder $b$, i.e., $T = ad + b$ with $0 \leq b < d$. Then \begin{equation} S_{d, T} = \sum_{k=1}^a \frac{d}{k} + \frac{b}{a+1} = d\,H_a + \frac{b}{a+1}, \end{equation} where $H_a$ is the $a$-th harmonic number. Since $H_a \leq 1 + \log a \leq 1 + \log T$, this construction stays within the conjectured $O(\mathrm{poly}(d)\log T)$ bound, with $\mathrm{poly}(d) = d$.
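The closed form can be verified numerically by cycling through the standard basis (a sketch under that particular choice of $x_i$; `closed_form` is my own name):

```python
import numpy as np

# Conclusion: with T = a*d + b and the basis reused cyclically,
# S_{d,T} = sum_{k=1}^a d/k + b/(a+1).
d, T = 3, 11                              # a = 3, b = 2
a, b = divmod(T, d)
xs = [np.eye(d)[i % d] for i in range(T)]  # cycle e_1, ..., e_d, e_1, ...

A = np.eye(d)                              # A_0 = I
total = 0.0
for x in xs:
    total += np.linalg.norm(np.linalg.solve(A, x))
    A += np.outer(x, x)

closed_form = sum(d / k for k in range(1, a + 1)) + b / (a + 1)
print(total, closed_form)  # both 6.0 for d = 3, T = 11
```

Because the $x_i$ here are always coordinate vectors, $A_t$ stays diagonal, so each term is simply the reciprocal of the relevant diagonal entry.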