Existence of a sequence of vectors in recurrence relations


For each integer $i \geq 1$, let a fixed real matrix $A_i$ of size $n_i\times n_{i+1}$ be given, such that no $A_i$ has a row consisting entirely of zeros. The family of matrices $\{A_i:i\in\mathbb{N}\}$ satisfies the following property:

for every integer $m \geq 1$, there exist nonzero vectors $s_j\in\mathbb{R}^{n_j}$ such that $s_j=A_js_{j+1}$ for all $j\in\{1,\dots,m\}$.

Note that the collection of vectors $s_1,\dots,s_m$ may be chosen anew for each value of $m$. The following question asks whether we can avoid redefining these vectors.

Question: Does there necessarily exist an (infinite) sequence of nonzero vectors $(s_i)_{i\in\mathbb{N}}$ such that $s_i\in\mathbb{R}^{n_i}$ and $s_i=A_is_{i+1}$ for every $i\in\mathbb{N}$?


Simpler versions of this question have been answered first here and then here.


Best Answer

For a given start index $i,$ we define $R(i)$ as the value to which the rank of the product $A_i A_{i+1} A_{i+2}\ldots$ eventually drops if we keep on multiplying the matrices $A_j,\;j\geq i$: $$ R(i) = \min_{j,\,j\geq i} \mathrm{rank} \left( \prod_{k=i}^{j} A_k \right) $$ Since appending a further factor can never increase the rank of the product, this minimum is attained at some finite $j$. Note that $R(i)>0\;\forall i\in\mathbb{N}.$ If $R(i)$ were $0$, some product of matrices $A_{i}A_{i+1}\ldots A_j$ would be the null matrix; then $s_i = A_iA_{i+1}\cdots A_j\,s_{j+1}$ would be forced to be zero for every choice of $s_{j+1}$, which would contradict the premises of the question.
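As a numerical illustration of this stabilization, here is a sketch using NumPy on a hypothetical finite prefix of such a family (the matrices and the function name are invented for the example): the ranks of the partial products form a non-increasing sequence that settles at $R(i)$.

```python
import numpy as np

def partial_product_ranks(mats):
    """Ranks of A_0, A_0 A_1, A_0 A_1 A_2, ... (0-based indices)."""
    ranks, P = [], None
    for A in mats:
        P = A if P is None else P @ A
        ranks.append(int(np.linalg.matrix_rank(P)))
    return ranks

# Hypothetical family prefix: 3x3 matrices, none with a zero row.
A0 = np.array([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])  # rank 2
A1 = np.array([[1., 0., 0.], [1., 0., 0.], [0., 1., 0.]])  # rank 2
mats = [A0, A1, np.eye(3), np.eye(3)]
print(partial_product_ranks(mats))  # -> [2, 1, 1, 1]: the rank settles at R(0) = 1
```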

Obviously, $R(i)$ is monotonically increasing (though not necessarily strictly monotonically increasing). This can be seen as follows: \begin{eqnarray} R(i) & = & \min_{j,\,j\geq i} \mathrm{rank} \left( \prod_{k=i}^{j} A_k \right) \\ & \leq & \min_{j,\,j\geq i+1} \mathrm{rank} \left( \prod_{k=i}^{j} A_k \right) \\ & = & \min_{j,\,j\geq i+1} \mathrm{rank} \left( A_i \prod_{k=i+1}^{j} A_k \right) \\ & \leq & \min_{j,\,j\geq i+1} \mathrm{rank} \left( \;\;\;\; \prod_{k=i+1}^{j} A_k \right) \;\;\;\; = R(i+1) \end{eqnarray} Now we recursively define a strictly monotonically increasing sequence of indices $(i_k)_{k=0}^{\infty}$ such that $i_0=0$ and $$ i_k = \min \left\{ m\in\mathbb{N} \;\; \Big| \;\; m > i_{k-1} \;\; \wedge \;\; \mathrm{rank}\left(\prod_{i=i_{k-1}+1}^{m} A_i \right) = R(i_{k-1}+1) \right\} $$ In order to show that this is well-defined, we have to show that the set contains at least one element. But this is obvious: by the definition of $R$, the rank of the product must eventually attain the value $R(i_{k-1}+1)$ once we choose $m$ large enough.
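The chop points $i_k$ can be found greedily; here is a sketch (the function name and the toy matrices are invented for this illustration, and the limiting rank $R$ is assumed to be known) of locating the smallest $m$ at which the partial product attains rank $R$:

```python
import numpy as np

def next_chop(mats, start, R):
    """Smallest m >= start with rank(A_start @ ... @ A_m) == R (0-based)."""
    P = mats[start]
    m = start
    while np.linalg.matrix_rank(P) != R:
        m += 1
        P = P @ mats[m]
    return m

# Hypothetical prefix: rank(A0) = 2, rank(A0 @ A1) = 1 = R(0),
# so the first chunk ends at m = 1.
A0 = np.array([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
A1 = np.array([[1., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
mats = [A0, A1, np.eye(3), np.eye(3)]
print(next_chop(mats, 0, 1))  # -> 1
```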

Now we set $$ B_k = \prod_{i=i_{k-1}+1}^{i_k} A_i $$ We have $$ \mathrm{rank}\left(B_k \right) = R(i_{k-1}+1) \leq \mathrm{rank} \left( \prod_{i=i_{k-1}+1}^{i_{k+1}} A_i \right) = \mathrm{rank}\left(B_k B_{k+1} \right) \leq \mathrm{rank}\left(B_k \right) $$ Therefore, $\mathrm{rank}\left(B_k B_{k+1} \right) =\mathrm{rank}\left(B_k \right).$

We set $$ r_k = \mathrm{rank}\left(B_k \right) = R(i_{k-1}+1) $$ We have shown above that $R$ is increasing; therefore, $r_k$ is increasing, too.

Let $B_k= U_kV_k^T$ be a rank factorization, with an $\left(n_{i_{k-1}+1} \times r_k\right)$ matrix $U_k$ and an $\left(n_{i_k+1} \times r_k\right)$ matrix $V_k.$ (Such a factorization exists for every matrix of rank $r_k.$)
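Such a rank factorization can be computed, for instance, from the SVD; a minimal sketch (the function name and the numerical tolerance are assumptions made for this example):

```python
import numpy as np

def rank_factorization(B, tol=1e-10):
    """Return U, V with B = U @ V.T, both factors of full column rank r = rank(B)."""
    Uf, s, Vt = np.linalg.svd(B)
    r = int(np.sum(s > tol))
    return Uf[:, :r] * s[:r], Vt[:r, :].T   # absorb the singular values into U

B = np.array([[1., 2., 0.],
              [2., 4., 0.]])               # rank-1 example
U, V = rank_factorization(B)
print(U.shape, V.shape)                    # -> (2, 1) (3, 1)
assert np.allclose(U @ V.T, B)
```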

Both $U_k$ and $V_k$ have rank $r_k,$ i.e., full column rank.

Now we take a look at the matrices $V^T_kU_{k+1}$. We have $$ r_k = \mathrm{rank}\left(B_k B_{k+1}\right) = \mathrm{rank}\left(U_k V_k^T U_{k+1} V^T_{k+1}\right) \leq \mathrm{rank}\left(V_k^T U_{k+1} \right) \leq \mathrm{rank}\left(V_k^T\right) = r_k $$ Therefore, $\mathrm{rank}\left(V_k^T U_{k+1} \right) =r_k.$

$V_k^T U_{k+1}$ is an $r_k \times r_{k+1}$ matrix with rank $r_k,$ and $r_{k}\leq r_{k+1},$ so it has full row rank. This means that it has a right inverse $C_k$ with $(V_k^T U_{k+1})\,C_k = I.$
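For a full-row-rank matrix $M$, one right inverse can be written down explicitly as $C = M^T (M M^T)^{-1}$ (the Moore-Penrose pseudoinverse); a small check with an invented $2\times 3$ example:

```python
import numpy as np

M = np.array([[1., 0., 1.],
              [0., 1., 1.]])              # full row rank, r_k = 2 <= r_{k+1} = 3
C = M.T @ np.linalg.inv(M @ M.T)          # one right inverse of M
assert np.allclose(M @ C, np.eye(2))      # (V_k^T U_{k+1}) C_k = I
```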

This means that $$ w_{k+1} = C_k w_k \;\; \text{for} \;\;w_k\in\mathbb{R}^{r_k} \;\; \text{and} \;\;w_{k+1}\in\mathbb{R}^{r_{k+1}} $$ will definitely imply (by multiplication from the left with $V_k^T U_{k+1}$) $$ w_k = V_k^T U_{k+1} w_{k+1}\; . $$ Now we set $w_1 = e_1 \in \mathbb{R}^{r_1}$ (or any other non-zero $r_1$-dimensional vector) and $$ w_{k+1} = C_k w_k $$ We can choose $$ s_{i_{k-1}+1} = U_k w_k $$ The remaining $s_i$ can be calculated by using $s_i = A_i s_{i+1}.$ This is a consistent choice: $$ s_{i_{k-1}+1} = U_k w_k = U_k V_k^T U_{k+1} w_{k+1} = B_k\, U_{k+1} w_{k+1} = B_k\, s_{i_k+1}, $$ so the backward recursion across one chunk ends exactly at the vector prescribed at the next chunk boundary. Moreover, all the vectors are nonzero: each $w_k$ is nonzero by induction (if $w_{k+1}$ were zero, then $w_k = V_k^T U_{k+1} w_{k+1}$ would be zero as well), hence $s_{i_{k-1}+1} = U_k w_k \neq 0$ because $U_k$ has full column rank; and every intermediate $s_i$ is nonzero too, since a zero intermediate vector would force $s_{i_{k-1}+1}=0.$
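The construction can be sanity-checked numerically on a toy pair of chunks (the matrices, the SVD-based factorization, and the use of the pseudoinverse as the right inverse are all invented for this illustration):

```python
import numpy as np

def rank_factorization(B, tol=1e-10):
    """B = U @ V.T with U, V of full column rank rank(B), via SVD."""
    Uf, s, Vt = np.linalg.svd(B)
    r = int(np.sum(s > tol))
    return Uf[:, :r] * s[:r], Vt[:r, :].T

# Two hypothetical chunks with rank(B1 @ B2) = rank(B1) = 2, as the
# chopping construction guarantees.
B1 = np.array([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
B2 = np.array([[1., 0., 0.], [0., 1., 0.], [0., 1., 0.]])

U1, V1 = rank_factorization(B1)
U2, V2 = rank_factorization(B2)

M = V1.T @ U2                  # r_1 x r_2 with full row rank r_1
C1 = np.linalg.pinv(M)         # right inverse: M @ C1 = I

w1 = np.array([1., 0.])        # any nonzero w_1
w2 = C1 @ w1
s1, s2 = U1 @ w1, U2 @ w2      # vectors at the chunk boundaries

assert np.allclose(M @ C1, np.eye(2))
assert np.allclose(s1, B1 @ s2)          # consistency: s_1 = B_1 s_2
assert np.linalg.norm(s1) > 0            # and the vectors are nonzero
```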

Intuitively, instead of calculating the vectors $s_i,$ which are, in a certain sense, associated with the gaps in the product $$ U_1V_1^T\;\;\;U_2V_2^T\;\;\;U_3V_3^T\;\;\;U_4V_4^T\;\;\;U_5V_5^T\;\;\;\ldots $$ we calculate the vectors $w_k,$ which are associated with the gaps in $$ U_1\;\;\;V_1^TU_2\;\;\;V_2^TU_3\;\;\;V_3^TU_4\;\;\;V_4^TU_5\;\;\;V_5^T\;\;\;\ldots $$ The tricky part is to chop the sequence $(A_i)_{i=1}^{\infty}$ into portions that ensure that each of the resulting matrices $V_k^TU_{k+1}$ has a right inverse.