Find which columns of a matrix are combinations of the others


Consider the example where I have a matrix $\mathbf{D}$ in $-1/1$ coding with $5$ columns,

$$D = \begin{bmatrix}-1&-1&-1&1&1\\1&-1&-1&-1&1\\-1&1&-1&-1&-1\\1&1&-1&1&-1\\ -1&-1&1&1&-1\\1&-1&1&-1&-1\\-1&1&1&-1&1\\1&1&1&1&1\end{bmatrix}$$

We see that the fourth and fifth columns are combinations of the first three columns, so that if we label the columns $a,b,c,d,e$ we can say that $d=a*b$ and $e=b*c$ (elementwise products). We can separate the columns into two groups: $a,b,c$ are basic columns and $d,e$ are added columns. Furthermore, we can call the relations that define $d$ and $e$ as combinations of $a,b,c$ the defining contrasts.
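For concreteness, the claimed relations can be checked numerically. A sketch using NumPy, with the column labels $a,\dots,e$ taken from the question:

```python
import numpy as np

# The example matrix D; columns are labeled a, b, c, d, e as in the question.
D = np.array([
    [-1, -1, -1,  1,  1],
    [ 1, -1, -1, -1,  1],
    [-1,  1, -1, -1, -1],
    [ 1,  1, -1,  1, -1],
    [-1, -1,  1,  1, -1],
    [ 1, -1,  1, -1, -1],
    [-1,  1,  1, -1,  1],
    [ 1,  1,  1,  1,  1],
])
a, b, c, d, e = D.T

# In -1/1 coding the defining contrasts are elementwise products:
print(np.array_equal(d, a * b))  # True
print(np.array_equal(e, b * c))  # True
```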

My question is this: Is there a way to determine more generally (for a larger matrix or a matrix with different defining contrasts) which columns are combinations of the others, and what the defining relations are?


There are 3 answers below.

On BEST ANSWER

There is a mutuality in linear dependence.

If a vector $\bf a$ is linearly dependent on $\bf b$ and $\bf c$, then $\bf b$ is linearly dependent on $\bf a$ and $\bf c$, and vice versa.

This is no stranger than the fact that we can rewrite $${\bf v_0} = \sum_{i \neq 0} c_i {\bf v_i}$$ as $${\bf v_k} = \frac{1}{c_k}\left({\bf v_0} - \sum_{i \neq 0,\,k} c_i {\bf v_i}\right)$$ for any $k$ with $c_k \neq 0$.

The linear weights just get rescaled by $-1/c_k$ (and ${\bf v_0}$ enters with weight $1/c_k$).

So what we can do now is build a linear equation system: remove one column of $\bf D$ and use it as the data $\bf d$ (so $\bf D$ below denotes the matrix with that column removed), treat the remaining columns as regressors, and solve a classical linear least-squares problem:

$${\bf x}_o = \arg\min_{\bf x}\|{\bf D x - d}\|_2^2$$

If there is a perfect fit (zero residual), then $\bf d$ is linearly dependent on at least some of the vectors in $\bf D$. If not, some of the vectors in $\bf D$ may still be linearly dependent on each other.
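A minimal sketch of this check in NumPy (the helper name `is_dependent`, the example matrix, and the tolerance are my own choices, not from the answer):

```python
import numpy as np

def is_dependent(D, j, tol=1e-10):
    """Check whether column j of D is a linear combination of the other
    columns, by solving a least-squares problem and testing for a
    (numerically) perfect fit."""
    d = D[:, j]
    rest = np.delete(D, j, axis=1)          # D with column j removed
    x, *_ = np.linalg.lstsq(rest, d, rcond=None)
    residual = np.linalg.norm(rest @ x - d)
    return residual < tol

# Example: the last column is the sum of the first two.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])
print(is_dependent(A, 2))   # True
print(is_dependent(A, 0))   # True: column 0 = column 2 - column 1
```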

On

In the comments, @Rodrigo de Azevedo said you can apply Gaussian elimination, which is quite an intuitive method. But I personally suggest that if you want to find the linear dependence relations between the rows or columns of a square matrix, first take the determinant of the matrix; if it is $0$, then apply Gaussian elimination. This will reduce the work.
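A quick sketch of that pre-check in NumPy (the example matrix and the tolerance are my own; floating-point determinants are rarely exactly zero, so a threshold is needed):

```python
import numpy as np

M = np.array([[1., 2., 3.],
              [2., 4., 6.],   # second row = 2 * first row
              [1., 1., 1.]])

# A numerically zero determinant signals that some rows/columns are
# linearly dependent -- only then is the elimination work worthwhile.
if abs(np.linalg.det(M)) < 1e-10:
    print("M is singular: apply Gaussian elimination to find the relations")
```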

On

For an $m \times n$ matrix $A$, let $A'$ be the unique row-reduced echelon matrix that is row-equivalent to $A.$ The relations of linear dependence among the columns of $A$ are the same as those among the columns of $A'.$ The columns of $A'$ in which a leading $1$ appears are a basis for the column space of $A'$, and the columns of $A$ in the same positions are a basis for the column space of $A.$

Let the $t$-th column of $A$ be $\zeta_t$ and let the $t$-th column of $A'$ be $\xi_t$, for $1 \le t \le n.$ For a column $\zeta$ of $A$, let $\xi$ be the column of $A'$ in the same position, let the entry in row $i$ of $\xi$ be $c_i$, and, if $c_i \ne 0$, let the leading $1$ in row $i$ of $A'$ occur in column $\nu_i$ of $A'$, for $1 \le i \le m.$ If $c_i=0,$ set $\zeta_{\nu_i}=\xi_{\nu_i}=\mathbf 0.$ Then $$\xi=\sum_{i=1}^m c_i\,\xi_{\nu_i}$$ and $$\zeta=\sum_{i=1}^m c_i\,\zeta_{\nu_i}.$$
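This recipe is mechanical enough to sketch in code. Here is one way using SymPy's exact `rref` (the example matrix is mine, chosen so that the last two columns depend on the first two):

```python
import sympy

A = sympy.Matrix([
    [1, 2, 0, 3],
    [0, 1, 1, 1],
    [1, 3, 1, 4],
])
R, pivots = A.rref()   # R is the reduced row-echelon form A'
print(pivots)          # (0, 1): these columns of A are a column-space basis

# Each non-pivot column of R holds the coefficients c_i that express the
# corresponding column of A in terms of the pivot columns.
for j in range(A.cols):
    if j not in pivots:
        terms = [f"{R[i, j]}*col{p}" for i, p in enumerate(pivots)]
        print(f"col{j} = " + " + ".join(terms))
# col2 = -2*col0 + 1*col1
# col3 = 1*col0 + 1*col1
```

Working over exact rationals (as SymPy does) avoids the tolerance questions that arise with floating-point elimination.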