$V=\left\{ C \mid C\in M_{q\times r}(\mathbb{R}) \text{ such that } ACB=O \right\}$, find the dimension of the vector space $V$


Let $A$ be a $p\times q$ matrix of rank $\alpha$ and $B$ an $r\times s$ matrix of rank $\beta$. Let $V=\left \{ C \mid C\in M_{q\times r}(\mathbb{R}) \text{ and } ACB=0_{p\times s} \right \} $. Find the dimension of the vector space $V$.

Can I solve this as follows?

(1) The range space of $B$ lies in the domain of $C$, where $\mathrm{rank}(B)=\beta$.

(2) $C$ must map the range space of $B$ into the null space of $A$, where $\mathrm{nullity}(A)=q-\mathrm{rank}(A)=q-\alpha$.

Hence $C$ need not use all $q\times r$ degrees of freedom; essentially only a $(q-\alpha)\times \beta$ block is constrained.

Please share your opinions!


Best answer:

Let $f : \mathbb{R}^q \rightarrow \mathbb{R}^p$, and $g : \mathbb{R}^s \rightarrow \mathbb{R}^r$ be the linear transformations canonically associated to $A$ and $B$ respectively.

The set $W=\lbrace h \in \mathcal{L}(\mathbb{R}^r,\mathbb{R}^q) \mid f \circ h \circ g= 0 \rbrace$ is clearly isomorphic to $V$, so we shall find the dimension of $W$.

Let $\mathcal{B}=(e_1, ..., e_r)$ be a basis of $\mathbb{R}^r$ such that $(e_1, ..., e_\beta)$ is a basis of $\mathrm{Im}(g)$.

Then for $h \in \mathcal{L}(\mathbb{R}^r,\mathbb{R}^q)$, one has the following \begin{align*} h \in W & \Longleftrightarrow h(\mathrm{Im}(g)) \subset \mathrm{Ker}(f) \\ & \Longleftrightarrow h(e_i) \in \mathrm{Ker}(f) \text{ for every } i=1, ..., \beta \\ & \Longleftrightarrow (h(e_1), ..., h(e_r)) \in \mathrm{Ker}(f)^\beta \times (\mathbb{R}^q)^{r-\beta} \end{align*}

Since the map $\varphi : \mathcal{L}(\mathbb{R}^r,\mathbb{R}^q) \rightarrow (\mathbb{R}^q)^r$ defined by $\varphi(h)=(h(e_1), ..., h(e_r))$ is an isomorphism, one deduces that $$\dim(W)=\dim(\mathrm{Ker}(f)^\beta \times (\mathbb{R}^q)^{r-\beta})=\beta \times\dim(\mathrm{Ker}(f)) + (r-\beta)q$$

Since $\dim(\mathrm{Ker}(f))=q-\alpha$, one has $\dim(W)=\beta(q-\alpha) + (r-\beta)q$, i.e.

$$\boxed{\dim(W)=rq-\alpha\beta}$$
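As a numerical sanity check of this formula (my own addition, not part of the answer): using the identity $\operatorname{vec}(ACB) = (B^{\mathsf T} \otimes A)\operatorname{vec}(C)$, $V$ is the null space of $B^{\mathsf T} \otimes A$, whose rank is $\operatorname{rank}(A)\operatorname{rank}(B) = \alpha\beta$. A numpy sketch with arbitrarily chosen sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, r, s = 4, 5, 6, 3
alpha, beta = 2, 2

# Random matrices of prescribed ranks alpha and beta
A = rng.standard_normal((p, alpha)) @ rng.standard_normal((alpha, q))  # rank alpha
B = rng.standard_normal((r, beta)) @ rng.standard_normal((beta, s))    # rank beta

# vec(ACB) = (B^T kron A) vec(C), so V is the null space of B^T kron A
K = np.kron(B.T, A)                       # maps vec(C) in R^{qr} to vec(ACB)
dim_V = q * r - np.linalg.matrix_rank(K)  # nullity = qr - rank = qr - alpha*beta

assert dim_V == r * q - alpha * beta
print(dim_V)
```

Here the rank of the Kronecker product is the product of the ranks, so the nullity is $qr-\alpha\beta$, in agreement with the boxed result.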

Another answer:

I will describe my (almost) complete thought process that led me to a solution. $\DeclareMathOperator{\im}{Im}$ (Below, $\im$ and the built-in $\ker$ denote image and kernel.) I find it helpful to "draw" the chain of operators involved in the condition imposed on your $C$'s like this:

$$\forall v\in\mathbb{R}^{s}: v\overset{B}{\to}Bv\overset{C}{\to}CBv\overset{A}{\to}ACBv=0$$

You're looking at such $C$'s that every $v$ going through the multiple stages ends up as the zero vector. This is the "going forward" point of view.

It is now natural (at least to me) to look at what happens if we go over this chain backwards:

  1. Starting from the zero vector, it has been produced by $A$, hence $CBv\in\ker A$.
  2. Going back, $CBv$ has been produced by $C$. But $C$ is arbitrary, in the sense that it is not "given" in the problem statement, so we can't say much at this point about this part of the chain.
  3. We're at the last piece of the chain now, $v\overset{B}{\to}Bv$. Here, $B$ is fixed while $v$ is "floating" around all of $\mathbb R^s$. What we can say here is that (again, by definition) $Bv \in \im B$.

What I said in all these words so far is nearly trivial: the $C$'s in $V$ are obliged to send $\im B$ into $\ker A$: $$\im B\overset{C}{\to}\ker A.$$

Even more, this is a sufficient property. That is, $$\forall C: ACB=0\Leftrightarrow C(\im B)\subseteq\ker A.$$

In other words, we're looking at all the $C$'s whose restriction to $\im B$ is a linear map $\im B\to\ker A$: $$V=\left\{ C \mid C\vert_{\im B}\in\hom(\im B, \ker A)\right\}$$ Now note that outside of $\im B$, we have no restrictions on $C$ - $C$ is "free" to send the "rest" of $\mathbb{R}^{r}$ (any vector space complement, say, $\im B^\perp$) in any way it wants to $\mathbb R^q$. Hence $$\dim V= \dim\hom\left(\im B^\perp,\mathbb{R}^{q}\right) + \dim\hom(\im B,\ker A)=(r-\beta)q+\beta(q-\alpha)=rq-\alpha\beta.$$
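To make this concrete, here is a small numpy sketch (my own addition, with arbitrarily assumed dimensions and names) that builds a $C$ of the form described - mapping $\im B$ into $\ker A$ and acting freely on $\im B^\perp$ - and checks that $ACB=0$:

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, r, s, alpha, beta = 4, 5, 6, 3, 2, 2
A = rng.standard_normal((p, alpha)) @ rng.standard_normal((alpha, q))  # rank alpha
B = rng.standard_normal((r, beta)) @ rng.standard_normal((beta, s))    # rank beta

# Orthonormal basis U of Im B and a basis N of ker A, both via SVD
U = np.linalg.svd(B)[0][:, :beta]   # r x beta, columns span Im B
N = np.linalg.svd(A)[2][alpha:].T   # q x (q - alpha), columns span ker A

# Any C of the form N X U^T + Y (I - U U^T) sends Im B into ker A
# and acts arbitrarily on the orthogonal complement of Im B;
# X and Y are free parameters.
X = rng.standard_normal((q - alpha, beta))
Y = rng.standard_normal((q, r))
C = N @ X @ U.T + Y @ (np.eye(r) - U @ U.T)

assert np.allclose(A @ C @ B, 0)
```

Counting the free parameters ($X$ has $(q-\alpha)\beta$ entries, and $Y$ contributes $q$ free entries per dimension of the complement) recovers $\beta(q-\alpha)+(r-\beta)q$.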

Another answer:

It is not hard to check that $V$ is a vector space.

Since $A$ is a $p \times q$ matrix of rank $\alpha$, an invertible $p \times p$ matrix $F_1$ and an invertible $q \times q$ matrix $F_2$ exist such that $$ A = F_1 A_0 F_2, $$ in which $$ A_0 = \begin{bmatrix} I_{\alpha} & 0_{\alpha \times (q - \alpha)} \\ 0_{(p - \alpha) \times \alpha} & 0_{(p - \alpha) \times (q - \alpha)} \\ \end{bmatrix}. $$

Since $B$ is an $r \times s$ matrix of rank $\beta$, an invertible $r \times r$ matrix $G_1$ and an invertible $s \times s$ matrix $G_2$ exist such that $$ B = G_1 B_0 G_2, $$ in which $$ B_0 = \begin{bmatrix} I_{\beta} & 0_{\beta \times (s - \beta)} \\ 0_{(r - \beta) \times \beta} & 0_{(r - \beta) \times (s - \beta)} \\ \end{bmatrix}. $$

Hence $ACB = 0$ becomes $$ F_1 A_0 (F_2 C G_1) B_0 G_2 = 0_{p \times s}, $$ which, since $F_1$ and $G_2$ are invertible, means $$ A_0 (F_2 C G_1) B_0 = 0_{p \times s}. $$ Denote the $q \times r$ matrix $F_2 C G_1$ by $X$. It is not hard to find that $A_0 X B_0 = 0$ if and only if $$ [X]_{i,j} =0 \qquad \text{for $i \leq \alpha$ and $j \leq \beta$}, $$ in which $[X]_{i,j}$ is the $(i,j)$-entry of $X$.
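A quick numerical illustration of this claim (a sketch I am adding, not part of the original answer; the sizes are arbitrary): $A_0 X B_0$ consists exactly of the top-left $\alpha\times\beta$ block of $X$, padded with zeros.

```python
import numpy as np

p, q, r, s, alpha, beta = 4, 5, 6, 3, 2, 2
A0 = np.zeros((p, q)); A0[:alpha, :alpha] = np.eye(alpha)
B0 = np.zeros((r, s)); B0[:beta, :beta] = np.eye(beta)

X = np.arange(1, q * r + 1, dtype=float).reshape(q, r)
P = A0 @ X @ B0

# A0 picks the first alpha rows of X, B0 picks its first beta columns,
# so A0 X B0 keeps exactly the top-left alpha x beta block of X.
assert np.array_equal(P[:alpha, :beta], X[:alpha, :beta])
assert not P[alpha:].any() and not P[:, beta:].any()
```

So $A_0 X B_0 = 0$ precisely when that block of $X$ vanishes.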

Let $E_{u,v}$ be the $q \times r$ matrix with the property that $$ [E_{u,v}]_{i,j} = \begin{cases} 1, & \text{$u = i$ and $v = j$}; \\ 0, & \text{else}. \end{cases} $$

Let $J$ be the set $$ \{ F_2^{-1} E_{u,v} G_1^{-1} \mid \text{$u > \alpha$ or $v > \beta$} \}. $$

(1) Every solution to $ACB = 0$ is some linear combination of the members of $J$. If $A C_0 B = 0$, then $A_0 (F_2 C_0 G_1) B_0 = 0$, which means $$ [F_2 C_0 G_1]_{i,j} =0 \qquad \text{for $i \leq \alpha$ and $j \leq \beta$}. $$ Hence $$ \begin{aligned} F_2 C_0 G_1 = {} &\sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ }} {[F_2 C_0 G_1]_{i,j} E_{i,j}} \\ = {} & \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i \leq \alpha \,\text{and}\, j \leq \beta }} {[F_2 C_0 G_1]_{i,j} E_{i,j}} + \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {[F_2 C_0 G_1]_{i,j} E_{i,j}} \\ = {} & \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {[F_2 C_0 G_1]_{i,j} E_{i,j}}, \end{aligned} $$ which means $$ C_0 = F_2^{-1} (F_2 C_0 G_1) G_1^{-1} = \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {[F_2 C_0 G_1]_{i,j} (F_2^{-1} E_{i,j} G_1^{-1})}. $$

(2) The members of $J$ are linearly independent. Suppose that $$ \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {k_{i,j} (F_2^{-1} E_{i,j} G_1^{-1})} = 0. $$ Hence $$ F_2 \left( \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {k_{i,j} (F_2^{-1} E_{i,j} G_1^{-1})} \right) G_1 = 0. $$ Hence $$ \sum_{\substack{ 1 \leq i \leq q \\ 1 \leq j \leq r \\ i > \alpha \,\text{or}\, j > \beta }} {k_{i,j} E_{i,j}} = 0. $$ Hence $k_{i,j} = 0$.

We have shown that the members of $J$ form a basis of $V$.

Since $J$ has $rq - \alpha \beta$ members, the dimension of $V$ is $rq - \alpha \beta$.
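As a side note (my own addition, not part of the original answer), a factorization $A = F_1 A_0 F_2$ of the kind used above can be obtained numerically from an SVD by folding the nonzero singular values into $F_1$; a sketch with assumed sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
p, q, alpha = 4, 5, 2
A = rng.standard_normal((p, alpha)) @ rng.standard_normal((alpha, q))  # rank alpha

U, sv, Vt = np.linalg.svd(A)
A0 = np.zeros((p, q)); A0[:alpha, :alpha] = np.eye(alpha)

# Fold the nonzero singular values into F1 so that A = F1 A0 F2
D = np.eye(p); D[:alpha, :alpha] = np.diag(sv[:alpha])
F1, F2 = U @ D, Vt    # both invertible (U, Vt orthogonal; D has nonzero diagonal)

assert np.allclose(F1 @ A0 @ F2, A)
```

The same construction applied to $B$ yields $G_1$ and $G_2$.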