Matrix analysis and the rank of a matrix


I'm trying to prove the following: if $V(A) \cap N(A)=\{0\}$, where $A$ is an $n \times n$ matrix, $V(A)$ denotes the range (column space), and $N(A)$ the kernel (null space) of $A$, then there exists a non-singular matrix $B$ such that $A^2=BA$.



By your assumption you have a direct sum decomposition $\mathbb{R}^n=V(A)\oplus N(A)$; that is, for every $x\in\mathbb{R}^n$ there are unique $x_V\in V(A)$ and $x_N\in N(A)$ such that $x=x_V+x_N$.

Now define the linear map $B$ by $Bx=B(x_V+x_N)=Ax_V+x_N$. Then

$$BAx=BA(x_V+x_N)\underbrace{=}_{x_N\in N(A)}BAx_V\underbrace{=}_{Ax_V\in V(A)}AAx_V=A^2x_V \underbrace{=}_{x_N\in N(A^2)}A^2(x_V+x_N)=A^2x$$

for all $x\in \mathbb{R}^n$.

And $Bx=Ax_V+x_N=0$ implies $x_N=A(-x_V)\in N(A)\cap V(A)$, so $x_N=0$ and therefore $Ax_V=0$; hence $x_V\in V(A)\cap N(A)$, which means $x_V=0$.

Altogether it follows that $x=0$, thus $B$ is non-singular.
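A quick numerical sanity check of this construction (a sketch with a hypothetical $2\times 2$ example; the bases of $V(A)$ and $N(A)$ are found by hand, `numpy` assumed):

```python
import numpy as np

# Hypothetical example: A is singular but V(A) ∩ N(A) = {0}.
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])

# Bases found by hand: V(A) = span((1,0)), N(A) = span((1,-2)).
S = np.array([[1.0, 1.0],
              [0.0, -2.0]])       # columns: basis of V(A), then of N(A)

# P projects onto V(A) along N(A): for x = x_V + x_N, P x = x_V.
P = S @ np.diag([1.0, 0.0]) @ np.linalg.inv(S)

# The map from the answer: B x = A x_V + x_N, i.e. B = A P + (I - P).
B = A @ P + (np.eye(2) - P)

assert not np.isclose(np.linalg.det(B), 0.0)   # B is non-singular
assert np.allclose(B @ A, A @ A)               # B A = A^2
```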


Method 1: somewhat similar to @PeterMelech's. Set $$ B=A\cdot AA^++(I-AA^+) $$ where $A^+$ is the Moore-Penrose pseudoinverse.

  1. $BA=A^2$ follows from the defining identity $AA^+A=A$: indeed, $BA=A\cdot AA^+A+(I-AA^+)A=A^2+A-A=A^2$.
  2. $Bx=0$ implies $x=0$ as

$$ Bx=0\ \Rightarrow \ \underbrace{A\cdot AA^+x+(I-AA^+)x}_{\text{orthogonal vectors}}=0\ \Rightarrow \begin{cases} A\cdot AA^+x=0,\\ (I-AA^+)x=0, \end{cases}\Rightarrow \begin{cases} \underbrace{AA^+x}_{\in V\cap N}=0,\\ (I-AA^+)x=0, \end{cases} \ \Rightarrow \ x=0. $$
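This formula is easy to verify numerically (same hypothetical singular $A$ as before; `numpy.linalg.pinv` computes $A^+$):

```python
import numpy as np

# Hypothetical singular A with V(A) ∩ N(A) = {0}.
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])

Ap = np.linalg.pinv(A)            # Moore-Penrose pseudoinverse A^+
B = A @ A @ Ap + (np.eye(2) - A @ Ap)   # B = A·AA^+ + (I - AA^+)

assert np.allclose(B @ A, A @ A)              # BA = A^2, via AA^+A = A
assert not np.isclose(np.linalg.det(B), 0.0)  # B is non-singular
```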


Method 2: via Jordan normal form.

Lemma: $\text{Im}(A)\cap\ker(A)=\{0\}$ $\Rightarrow$ $\ker(A)=\ker(A^2)$.

Proof: $\ker(A)\subset\ker(A^2)$ always, and $\ker(A^2)\subset\ker(A)$ because $$ A^2x=0\quad\Rightarrow\quad Ax\in\ker(A)\cap\text{Im}(A)=\{0\}.\qquad\square $$

Consider two cases:

  1. If $A$ is nonsingular then $B=A$, done.
  2. If $A$ is singular, it suffices to prove the statement for $J_A$, the Jordan form of $A$: writing $A=SJ_AS^{-1}$, the relation $J_A^2=\tilde BJ_A$ yields $A^2=BA$ with $B=S\tilde BS^{-1}$. Since by the Lemma $\ker(A^2)=\ker(A)$, the Jordan blocks of $A$ corresponding to $\lambda=0$ are diagonal, i.e. $$ J_A=\left[\matrix{J_1 & 0\\0 & 0}\right] $$ where $J_1$ is nonsingular (it collects the nonzero eigenvalues of $A$). Then to get $J_A^2=\tilde BJ_A$ we take $$ \tilde B=\left[\matrix{J_1 & 0\\0 & I}\right]. $$
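The block computation can be checked on a small hypothetical Jordan form (one nonsingular Jordan block with eigenvalue 3, one zero block):

```python
import numpy as np

# Hypothetical Jordan form: J1 is a non-singular 2x2 Jordan block,
# the zero eigenvalue contributes a 1x1 zero block.
J1 = np.array([[3.0, 1.0],
               [0.0, 3.0]])
Z = np.zeros((2, 1))
J_A = np.block([[J1,  Z],
                [Z.T, np.zeros((1, 1))]])

# Block-diagonal choice from Method 2: replace the zero block by I.
B = np.block([[J1,  Z],
              [Z.T, np.eye(1)]])

assert np.allclose(B @ J_A, J_A @ J_A)        # B J_A = J_A^2
assert not np.isclose(np.linalg.det(B), 0.0)  # B is non-singular
```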

Method 3: apply the following lemma to $L=(A^2)^T$ and $M=A^T$. Its hypothesis $\text{Im}((A^2)^T)=\text{Im}(A^T)$ holds because $\ker(A^2)=\ker(A)$ (the Lemma of Method 2) and the row space is the orthogonal complement of the kernel; transposing $L=MB$ then gives $A^2=B^TA$ with $B^T$ nonsingular.

Lemma: Given $m\times n$ matrices $L$ and $M$ $$ \text{Im}(L)=\text{Im}(M)\qquad\Leftrightarrow\qquad\exists\text{ nonsingular } B: L=MB. $$ Proof: $\Leftarrow$ trivial.

$\Rightarrow$ Permuting the columns of $L$ and $M$ if necessary, we may assume that the first $r=\text{rank}(L)=\text{rank}(M)$ columns (the blocks $L_1$ and $M_1$ below) form a basis of the image, i.e. $$ LP_1=\tilde L=[L_1\ L_2],\qquad MP_2=\tilde M=[M_1\ M_2] $$ for permutation matrices $P_1,P_2$. By the assumption, the columns of $L_1$, $L_2$ and $M_2$ are linear combinations of the columns of $M_1$, i.e.

  1. $\exists$ $r\times r$ nonsingular $B_1$ such that $L_1=M_1B_1$.
  2. $\exists$ $r\times (n-r)$ matrix $B_2$ such that $L_2=M_1B_2$.
  3. $\exists$ $r\times (n-r)$ matrix $B_3$ such that $M_2=M_1B_3$ $\Rightarrow$ $[M_1\ M_2]\left[\matrix{-B_3\\I}\right]=0$.

Combining all together we get $$ [L_1\ L_2]=[M_1\ M_2]\left[\matrix{B_1 & B_2\\0 & 0}\right]= [M_1\ M_2]\left[\matrix{B_1 & B_2\\0 & 0}\right]+ \underbrace{[M_1\ M_2]\left[\matrix{0 & -B_3\\0 & I}\right]}_{=0}= [M_1\ M_2]\underbrace{\left[\matrix{B_1 & *\\0 & I}\right]}_{\tilde B}. $$ Finally, $B=P_2\tilde B P_1^{-1}$.
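Applied to the original problem with the same hypothetical $A=\left[\matrix{2&1\\0&0}\right]$, the lemma's hypothesis and conclusion can be checked numerically (a least-squares solution of $L=MB'$ may itself be singular, so its kernel part is patched with a column of $\ker(M)=\text{span}((0,1))$, found by hand for this example):

```python
import numpy as np

# Sanity check of the lemma with L = (A^2)^T, M = A^T.
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])
L = (A @ A).T
M = A.T

# Hypothesis Im(L) = Im(M): rank(L) = rank(M) = rank([L M]).
r = np.linalg.matrix_rank
assert r(L) == r(M) == r(np.hstack([L, M]))

# One solution of L = M B' via least squares; patch its kernel part
# with the hand-computed ker(M) direction to make it non-singular.
Bp, *_ = np.linalg.lstsq(M, L, rcond=None)
Bp = Bp + np.array([[0.0, 0.0],
                    [0.0, 1.0]])              # M @ patch = 0
assert np.allclose(M @ Bp, L)                 # still solves L = M B'
assert not np.isclose(np.linalg.det(Bp), 0.0)

# Transposing L = M B' gives A^2 = B'^T A, as desired.
assert np.allclose(Bp.T @ A, A @ A)
```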