If $AX=0 \Rightarrow BX=0$, prove that there is $C$ such that $B=CA$


Let $A,B$ be $3 \times 3$ real matrices such that, for any vector $v$, if $Av=0$ then we also have $Bv=0$.
Prove that there is $C$ such that $B=CA$.

If $A$ is invertible, we can take $C=BA^{-1}$. Now suppose $A$ is not invertible, so $\det A=0$. Then the system of linear equations $Ax=0$ has a nonzero solution, which by hypothesis is also a solution of $Bx=0$, so $\det B=0$.
Let $v \neq 0$ be such that $Av=Bv=0$. Then $(A+B)v=(A-B)v=0$, which gives $\det(A+B)=\det(A-B)=0$. Writing $$\det(A+\lambda B)=\det A +p\lambda +q \lambda^2+(\det B)\lambda^3=p\lambda+q\lambda^2,$$ these results give $p+q=0$ and $-p+q=0$, so $p=q=0$ and hence $\det(A+\lambda B)=0$ for all $\lambda \in \mathbb{C}$. This is where I got stuck. Is there any way to finish from here?
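As a sanity check on the displayed expansion, here is a quick numerical illustration (the pair $A,B$ below is an arbitrarily chosen example satisfying the hypothesis, both annihilating $e_3$):

```python
import numpy as np

# A concrete pair with null(A) contained in null(B): both kill e3.
A = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 0.]])
B = np.array([[2., 0., 0.], [0., 3., 0.], [0., 0., 0.]])

# det(A + lambda*B) should vanish identically, matching p = q = det A = det B = 0.
dets = [np.linalg.det(A + lam * B) for lam in (-1.0, 0.5, 2.0, 7.3)]
```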

2 Answers

BEST ANSWER

Actually, this is no harder for general $n\times n$ matrices.

Write $A=[a_1\ a_2\ \dots\ a_n]$ and $B=[b_1\ b_2\ \dots\ b_n]$. You want to find a matrix $C$ such that $Ca_i=b_i$, for $i=1,\dots,n$.

Suppose we have $\alpha_1a_1+\alpha_2a_2+\dots+\alpha_na_n=0$, which amounts to saying that $$ A\begin{bmatrix}\alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{bmatrix}=0 $$ By hypothesis, this implies $$ B\begin{bmatrix}\alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{bmatrix}=0 $$ Thus any linear relation between the columns of $A$ implies the same relation between the columns of $B$.

Isolate a maximal linearly independent set $\{a_{i_1},\dots,a_{i_k}\}$ among the columns of $A$ and complete it to a basis $\{a_{i_1},\dots,a_{i_k},v_{k+1},\dots,v_n\}$ of $\mathbb{R}^n$. Then there exists a linear transformation sending $a_{i_j}$ to $b_{i_j}$ (for the columns extracted above, so $j=1,\dots,k$) and the remaining basis vectors to, say, $0$ (the choice is actually irrelevant).

Let $C$ be the matrix of this linear transformation. Then $Ca_{i_j}=b_{i_j}$, for $j=1,\dots,k$. Also, if $a_{r}=\beta_1a_{i_1}+\dots+\beta_ka_{i_k}$, then we know that $$ b_{r}=\beta_1b_{i_1}+\dots+\beta_kb_{i_k} $$ so also $Ca_{r}=b_r$.

Hence $CA=B$.
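The construction above can be sketched numerically (a rough illustration with numpy; the greedy column selection and the helper name `build_C` are my own choices, and solving $Ca_{i_j}=b_{i_j}$ is done via the pseudoinverse of the selected columns, which sends vectors orthogonal to their span to $0$):

```python
import numpy as np

def build_C(A, B, tol=1e-10):
    """Construct C with CA = B, assuming null(A) is contained in null(B)."""
    n = A.shape[1]
    # Greedily isolate a maximal linearly independent set among the columns of A.
    idx = []
    for j in range(n):
        if np.linalg.matrix_rank(A[:, idx + [j]], tol=tol) == len(idx) + 1:
            idx.append(j)
    M = A[:, idx]   # the chosen columns a_{i_1}, ..., a_{i_k} (full column rank)
    N = B[:, idx]   # the corresponding columns of B
    # We need C M = N; since M has full column rank, M^+ M is the k-by-k
    # identity, so C = N M^+ satisfies C a_{i_j} = b_{i_j} for j = 1, ..., k.
    return N @ np.linalg.pinv(M)

# Example: the second column of A (and of B) is twice the first,
# so null(A) = null(B) = span{(-2, 1, 0)}.
A = np.array([[1., 2., 0.], [0., 0., 0.], [1., 2., 1.]])
B = np.array([[1., 2., 5.], [3., 6., 0.], [0., 0., 1.]])
C = build_C(A, B)
```

The dependent columns then come along for free, by the linear-relation argument above.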


If $\operatorname{null}A=\mathbb{R}^3$, then $A=0$, so the hypothesis forces $B=0$ and $C=0$ works.

If $\dim\operatorname{null}A=2$, let $\{v_1,v_2\}$ be a basis of $\operatorname{null}A$ and let $v_3\in\mathbb{R}^3$ be such that $\{v_1,v_2,v_3\}$ is a basis of $\mathbb{R}^3$. If $B=0$, take $C=0$. Otherwise $\dim\operatorname{null}B=2$ as well. Since $Av_3\neq0$, we can take a matrix $C$ such that $C(Av_3)=Bv_3$. Then, since$$C(Av_1)=0=Bv_1\text{ and }C(Av_2)=0=Bv_2,$$the matrices $CA$ and $B$ agree on the basis $\{v_1,v_2,v_3\}$, so $B=CA$.

The case in which $\dim\operatorname{null}A=1$ is similar, and you've already dealt with the case in which $\dim\operatorname{null}A=0$.
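Here is a concrete numerical instance of the $\dim\operatorname{null}A=2$ case (the matrices below are hypothetical examples; $C$ is built as the rank-one map sending $Av_3$ to $Bv_3$):

```python
import numpy as np

# A has rank 1, so null(A) = {v : v1 + 2*v2 + 3*v3 = 0} has dimension 2,
# and B is chosen with the same null space.
A = np.array([[0., 0., 0.], [0., 0., 0.], [1., 2., 3.]])
B = np.array([[2., 4., 6.], [0., 0., 0.], [0., 0., 0.]])

v3 = np.array([1., 0., 0.])    # completes a basis of null(A) to a basis of R^3
u, w = A @ v3, B @ v3          # we need C u = w, with u != 0
C = np.outer(w, u) / (u @ u)   # rank-one matrix sending u to w
```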