Looking for a particular example with matrices


For given dimensions $r$ and $n$, I want a sequence of matrices $A_1, A_2, \ldots \in \mathbb{R}^{r \times n}$ such that, for every $w$:

  • $\frac{1}{w} \sum_{k=1}^w \Vert A_k \Vert $ (the average spectral norm of the first $w$ matrices) is a non-decreasing function of $w$
  • $\frac{1}{w} \sum_{k=1}^w A_k $ (the average of the first $w$ matrices) is (nearly) constant with increasing $w$
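For concreteness, the two quantities can be computed as follows. This is a NumPy sketch; the random sequence here is only for illustration and is not claimed to satisfy either condition:

```python
import numpy as np

rng = np.random.default_rng(1)
r, n = 2, 3  # example dimensions
A = [rng.standard_normal((r, n)) for _ in range(50)]

for w in (10, 25, 50):
    # average spectral norm of the first w matrices
    avg_norm = np.mean([np.linalg.norm(A[k], 2) for k in range(w)])
    # average of the first w matrices (an r-by-n matrix)
    avg_matrix = np.mean(A[:w], axis=0)
    print(w, avg_norm, np.linalg.norm(avg_matrix, 2))
```

Here `np.linalg.norm(., 2)` on a 2-D array returns the largest singular value, i.e. the spectral norm.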

Can someone show such an example? (or show that this is not possible)


1 Answer


Let $X_w = \frac{1}{w} \sum_{k=1}^w A_k$.

Note that the sequence $X_w$ satisfies the recursion $$ X_{w+1} = \frac{1}{w+1}(A_{w+1} + wX_w). $$ Suppose that $X_w$ is constant, i.e. $X_w = X$ for some fixed $X$ and all $w$ (the same argument applies from any point onward if $X_w$ is only eventually constant). Substituting $X_{w+1} = X_w$ into the recursion and solving gives $$ X_{w} = \frac{1}{w+1}(A_{w+1} + wX_w) \implies\\ (w+1)X_w = A_{w+1} + wX_w \implies\\ A_{w+1} = X_w = X. $$ In other words: if the sequence $X_w$ is constant, then the sequence $A_k$ must also be constant. In particular, the average spectral norm is then the constant $\Vert X \Vert$, so with an exactly constant average matrix the first condition holds only trivially.
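The recursion and its consequence can be checked numerically. A minimal NumPy sketch (the random matrices are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 3, 4  # example dimensions

# For an arbitrary sequence, verify the running-average recursion
# X_{w+1} = (A_{w+1} + w * X_w) / (w + 1).
A = [rng.standard_normal((r, n)) for _ in range(10)]
X = A[0]
for w in range(1, len(A)):
    X = (A[w] + w * X) / (w + 1)
    assert np.allclose(X, np.mean(A[:w + 1], axis=0))

# For a constant sequence A_k = C (the only case with exactly constant
# average, per the argument above), the average spectral norm is constant.
C = rng.standard_normal((r, n))
norms = [np.mean([np.linalg.norm(C, 2) for _ in range(w)]) for w in range(1, 6)]
print(np.allclose(norms, norms[0]))  # prints True
```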