Can independence of a system and a vector be established if there is only cross-independence?


Say that I have the following linear system:

$$[A a'] \begin{bmatrix} x \\ x' \\ \end{bmatrix} =Ax + a'x' $$

I want to know when this system is zero if and only if $\begin{bmatrix} x \\ x' \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix}$ .

These are the properties of the system that I do know. I know that all the columns of $A$ are linearly independent, so the following always holds:

$$Ax = 0 \iff x =0$$

i.e. no column of $A$ can be expressed as a linear combination of the others.
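This property can be checked numerically: the columns of $A$ are independent exactly when $A$ has full column rank. A minimal numpy sketch, using a made-up $3\times 2$ matrix (not a matrix from the question):

```python
import numpy as np

# Hypothetical example matrix whose columns are linearly independent.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])

# Columns are independent iff rank(A) equals the number of columns,
# equivalently iff the null space of A is {0}, i.e. Ax = 0 <=> x = 0.
full_column_rank = (np.linalg.matrix_rank(A) == A.shape[1])
print(full_column_rank)  # True for this choice of A
```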

I also know that a new vector $a'$ is linearly independent of some column of $A$; i.e., there is a column of $A$ (say column $i$) such that:

$$a_i x_i +a' x' = 0 \iff \begin{bmatrix} x_i \\ x' \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix}$$

i.e. neither of them is a scalar multiple of the other.

Let $A_{-i}$ denote the matrix $A$ without the column $a_i$, and let $x_{-i}$ be the corresponding vector of coefficients. Notice that we can write:

$$Ax = 0 \iff x =0$$

equivalently as:

$$A_{-i}x_{-i} +a_ix_i = 0 \iff \begin{bmatrix} x_{-i} \\ x_{i} \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \end{bmatrix} $$

Notice that we can write the original problem as:

$$[A_{-i} \ a_i \ a'] \begin{bmatrix} x_{-i} \\ x_{i} \\ x' \\ \end{bmatrix} = A_{-i}x_{-i} + a_{i}x_{i} + a'x' $$

Obviously this is zero if all of the coefficients are zero. But I want to show that this is the only element of the null space (kernel) of $[A_{-i} \ a_i \ a']$. Notice that there is a "cross-independence" here, because the term $a_i$ is independent of both $a'$ and the columns of $A_{-i}$.
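The question, in numerical terms, is whether the assembled matrix $[A_{-i} \ a_i \ a']$ has a trivial kernel, which holds exactly when it has full column rank. A hedged sketch with made-up vectors (whether the check passes depends entirely on the specific $A$ and $a'$; these particular vectors are chosen only for illustration):

```python
import numpy as np

# Made-up example in R^3: split off column i of a two-column A.
A_minus_i = np.array([[1.0], [0.0], [0.0]])   # A without column a_i
a_i       = np.array([0.0, 1.0, 0.0])          # the distinguished column a_i
a_prime   = np.array([0.0, 0.0, 1.0])          # the new vector a'

# The original problem: is the kernel of [A_{-i}  a_i  a'] trivial?
M = np.column_stack([A_minus_i, a_i, a_prime])

# Trivial kernel iff rank equals the number of columns.
print(np.linalg.matrix_rank(M) == M.shape[1])  # True for this choice
```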

This is what I have so far. Consider the case when $A_{-i}x_{-i} + a_{i}x_{i} + a'x' = 0$. Then the following is true:

$$A_{-i}x_{-i} + a'x'= -a_{i}x_{i}$$

However, we know that $-a_{i}x_{i}$ cannot be expressed as a linear combination of any of the columns of $A_{-i}$, and $a'x'$ also cannot express $-a_{i}x_{i}$. Therefore, since neither of them can express $-a_{i}x_{i}$, the equation above only holds if every $x$ is zero, i.e.: $ \begin{bmatrix} x_{-i} \\ x_{i} \\ x' \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ \end{bmatrix} $

meaning that my original conjecture was indeed right. Is this proof correct? I was not sure, and I also was not sure whether it is rigorous enough.

1 Answer

Write $A$ as $A=\begin{bmatrix}a_1&a_2&\cdots&a_n\end{bmatrix}$ (assuming $A$ has $n$ columns). Then \begin{equation} \begin{bmatrix} A & a'\end{bmatrix}\begin{bmatrix} x \\ x'\end{bmatrix}=x_1a_1+x_2a_2+\cdots +x_na_n+x'a'\end{equation} where $x_1,x_2,\ldots,x_n$ are the components of $x$. Now, when is the expression above zero if and only if $x_1,x_2,\ldots,x_n,x'$ are all zero? Precisely when the vectors $a_1,a_2,\ldots,a_n,a'$ are linearly independent.

Notice that it is not sufficient that $a'$ is linearly independent of only some $a_i$; it must be linearly independent of each $a_i$. Otherwise there could be an index $j$ and scalars $c_j,c'\neq0$ such that $c_ja_j+c'a'=0$, and then the vector $z=(0,\ldots,0,c_j,0,\ldots,0,c')^T$ is a nonzero vector for which \begin{equation} \begin{bmatrix} A & a'\end{bmatrix}z=0.\end{equation}
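This failure mode can be seen concretely. In the numpy sketch below (the matrices are made up purely for illustration), $a'$ is independent of the single column $a_1$, yet $a'$ equals $a_2$, so $[A \ a']$ has a nontrivial null space:

```python
import numpy as np

# A = I_2 has linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
a1, a2 = A[:, 0], A[:, 1]

# a' is linearly independent of the single column a_1 ...
a_prime = np.array([0.0, 1.0])
assert np.linalg.matrix_rank(np.column_stack([a1, a_prime])) == 2

# ... but [A a'] still has a nontrivial kernel, because a' = a_2:
# the vector z = (0, 1, -1)^T is nonzero yet [A a'] z = 0.
M = np.column_stack([A, a_prime])
z = np.array([0.0, 1.0, -1.0])   # x_1 = 0, x_2 = 1, x' = -1
print(M @ z)                      # [0. 0.]
print(np.linalg.matrix_rank(M))  # 2, not 3
```

So independence from one column (the "cross-independence" in the question) does not rule out a dependence involving the other columns, which is exactly why the question's conjecture fails in general.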