How can same number of linearly independent vectors exist in smaller power nullspaces of a matrix?


Suppose I have an $n \times n$ matrix $A$. Let the null space of $A^k$ (denoted by $\mathcal{N}_{A^k}$) have dimension $2$. This means there are at most $2$ linearly independent vectors $u, v$ in $\mathcal{N}_{A^k}$.
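For anyone who wants to experiment with this setup numerically, here is a minimal sketch (the $3 \times 3$ matrix is a hypothetical example of my own, not taken from the question) that computes $\dim \mathcal{N}_{A^k}$ as $n - \operatorname{rank}(A^k)$:

```python
import numpy as np

# Hypothetical 3x3 example: a 2x2 nilpotent Jordan block padded with
# an invertible 1x1 block, so that dim N(A) = 1 while
# dim N(A^2) = dim N(A^3) = 2.
A = np.array([[0., 1., 0.],
              [0., 0., 0.],
              [0., 0., 1.]])

def nullity(M):
    """Dimension of the null space: number of columns minus rank."""
    return M.shape[1] - np.linalg.matrix_rank(M)

for k in range(1, 4):
    print(k, nullity(np.linalg.matrix_power(A, k)))
# prints: 1 1 / 2 2 / 3 2
```

Here the nullity stabilizes at $2$ from $k = 2$ on, matching the hypothesis $\dim \mathcal{N}_{A^k} = 2$.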

Let $\mathcal{W} = \mathbb{L}[u] + \mathcal{N}_{A^{k-1}}$, where $\mathbb{L}[S]$ denotes the set of linear combinations (the span) of the vector set $S$. Also, let $v \not\in \mathcal{W}$. Using the special way in which $\mathcal{W}$ was constructed, it can be shown that $Au, Av$ are linearly independent vectors in $\mathcal{N}_{A^{k-1}}$ (proof below). This reasoning can be extended to smaller powers of $A$ to obtain $2$ linearly independent vectors at each level.
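The fact that $Au$ and $Av$ land in $\mathcal{N}_{A^{k-1}}$ at all is just $A^{k-1}(Ax) = A^k x = \theta_n$. A quick numerical sanity check of this mapping property, using a hypothetical random nilpotent matrix of my own choosing (not from the question):

```python
import numpy as np

# Sanity check that A maps N(A^k) into N(A^{k-1}):
# if A^k u = 0, then A^{k-1}(A u) = A^k u = 0.
rng = np.random.default_rng(0)
# Strictly upper triangular => nilpotent (hypothetical example).
A = np.triu(rng.standard_normal((4, 4)), k=1)
k = 3

# Rows of Vt beyond the rank of A^k span the null space of A^k.
_, s, Vt = np.linalg.svd(np.linalg.matrix_power(A, k))
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]

for u in null_basis:
    # A u must be annihilated by A^{k-1}.
    assert np.allclose(np.linalg.matrix_power(A, k - 1) @ (A @ u), 0)
```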

However, the dimensions of the null spaces of decreasing powers of $A$ are non-increasing. In our particular case, there might exist some $\mathcal{N}_{A^j}$, $1 \le j < k$, whose dimension is $1$, so it can contain at most $1$ linearly independent vector. Yet the construction above supplies $2$ linearly independent vectors in each smaller-power null space.
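The tension can be made concrete numerically (without resolving the question). A sketch with a hypothetical $3 \times 3$ example of mine, having $\dim \mathcal{N}_{A^2} = 2$ and $\dim \mathcal{N}_{A} = 1$: it maps a basis of $\mathcal{N}_{A^2}$ through $A$ and reports the dimension the images actually span.

```python
import numpy as np

# Hypothetical example (not from the question) with dim N(A) = 1
# and dim N(A^2) = 2, i.e. k = 2 and j = 1.
A = np.array([[0., 1., 0.],
              [0., 0., 0.],
              [0., 0., 1.]])

def null_basis(M, tol=1e-10):
    """Orthonormal basis of N(M), as rows, via the SVD."""
    _, s, Vt = np.linalg.svd(M)
    return Vt[int(np.sum(s > tol)):]

B2 = null_basis(A @ A)        # basis of N(A^2): two rows
images = B2 @ A.T             # row i is A @ B2[i]
print(np.linalg.matrix_rank(images))  # prints 1
```

The two images span only a $1$-dimensional subspace of $\mathcal{N}_A$, consistent with $\dim \mathcal{N}_A = 1$.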

What is the flaw in the reasoning here?


Proof for linear independence of $Au,Av$:

\begin{align}
\alpha A u + \beta A v = \theta_n \\
\Rightarrow A (\alpha u + \beta v) = \theta_n \\
\Rightarrow (\alpha u + \beta v) \in \mathcal{N}_A \subseteq \mathcal{N}_{A^{k-1}} \\
\Rightarrow \beta v = x - \alpha u \hspace{2mm} \mbox{for some} \hspace{2mm} x \in \mathcal{N}_{A^{k-1}} \\
\Rightarrow \beta v \in \mathbb{L}[u] + \mathcal{N}_{A^{k-1}} = \mathcal{W} \\
\Rightarrow \beta = 0 \hspace{2mm} \mbox{, otherwise $v$ would be in $\mathcal{W}$} \\
\Rightarrow \alpha A u = \theta_n \\
\Rightarrow \alpha = 0 \hspace{2mm} \mbox{since $Au \neq \theta_n$, i.e., $u \not\in \mathcal{N}_A$}
\end{align}

(The last step implicitly assumes $u \not\in \mathcal{N}_A$, which holds if $u$ is chosen outside $\mathcal{N}_{A^{k-1}} \supseteq \mathcal{N}_A$.)

Therefore, $Au,Av$ are linearly independent.