Rigorous definition of linear dependence


I'm beginning to learn linear algebra, and I came across this definition of linear dependence in my book:

Let $M_1, M_2, \dots, M_k$ be $p\times n$ matrices. They are said to be linearly dependent iff the null matrix is a linear combination of them with not all of the coefficients being 0.

I wanted to write this in rigorous notation, without the "..." and the English words. But I'm not sure whether my version is correct, or whether a different notation is standard, so I'd like to ask you to correct it. Here it is:

$$\{M_k\}_{k\in K}\text{ is linearly dependent}\iff \exists \{\alpha_k\}_{k\in K} : \sum_{k\in K} \alpha_k M_k = 0 \ \land\ \exists x\in K : \alpha_x \neq 0.$$

(Here $\{M_k\}_{k\in K}$ is supposed to represent an arbitrary set of matrices indexed by $K$.)
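As a concrete sanity check of the definition, here is a small numpy sketch (the function name `linearly_dependent` and the example matrices are my own, not from the question): flatten each $p\times n$ matrix into a vector of length $pn$; a nonzero coefficient family $\{\alpha_k\}$ with $\sum_k \alpha_k M_k = 0$ exists exactly when those $k$ vectors are linearly dependent, i.e. when the stacked $k\times pn$ array has rank less than $k$.

```python
import numpy as np

def linearly_dependent(matrices):
    # Flatten each p x n matrix to a row vector and stack: shape (k, p*n).
    stacked = np.stack([M.ravel() for M in matrices])
    # The matrices are dependent iff the rows are, i.e. rank < k.
    return np.linalg.matrix_rank(stacked) < len(matrices)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 4.0], [6.0, 8.0]])   # B = 2A, so {A, B} is dependent
C = np.array([[0.0, 1.0], [1.0, 0.0]])   # not a multiple of A

print(linearly_dependent([A, B]))  # True:  2*A + (-1)*B = 0
print(linearly_dependent([A, C]))  # False: only alpha = (0, 0) works
```

This only tests the definition numerically, of course; it says nothing about which notation is best.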


Best answer:

The way you formalized it is completely correct and sufficient. I do, however, want to emphasize that over-formalization is not helpful for the reader. Most of the time the first version (the one with the "...") is more convenient and easier to understand, and the "..." does not really compromise rigor.


Anyway, let me present another way to write it, using the "function set notation" to avoid the "...":

$$A^B:=\{f\mid f:B\to A\}.$$

This is the standard interpretation of the notation $A^B$, maybe you are already familiar with it. For $x\in A^B$ and $b\in B$, one writes $x_b$ instead of $x(b)$ to emphasize the tuple nature of the elements of $A^B$ instead of their function nature. You know it from notations like $\Bbb R^3$ which is just a short form of $\Bbb R^{\{1,2,3\}}$, and for $x\in\Bbb R^3$ we also just write $x_1,x_2,x_3$ for the components instead of $x(1),x(2),x(3)$. Just view it as a neat way to denote sequences over arbitrary index sets:

$$\{x_k\}_{k\in B} \text{ with $x_k\in A$}\qquad\Longleftrightarrow\qquad x\in A^B.$$

Further, note that if $A$ is a vector space, then so is $A^B$ under component-wise operations. E.g. $x,y\in\Bbb R^B$ are added component-wise via $(x+y)_b = x_b + y_b$.
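The component-wise operations on $A^B$ can be sketched in code; here elements of $\Bbb R^B$ are modeled as Python dicts keyed by an arbitrary finite index set $B$ (the helper names `add` and `scale` and the color-keyed example are my own illustration):

```python
def add(x, y):
    # (x + y)_b = x_b + y_b, defined only when x and y share the index set B.
    assert x.keys() == y.keys()
    return {b: x[b] + y[b] for b in x}

def scale(c, x):
    # (c * x)_b = c * x_b
    return {b: c * x[b] for b in x}

B = {"red", "green", "blue"}              # an arbitrary finite index set
x = {"red": 1.0, "green": 2.0, "blue": 3.0}
y = {"red": 0.5, "green": -2.0, "blue": 1.0}

print(add(x, y))      # the component-wise sum
print(scale(2.0, x))  # the component-wise scalar multiple
```

The point is only that nothing about the index set needs to be $\{1,\dots,n\}$: any set $B$ works, which is exactly what lets the answer below index matrices by an arbitrary finite $K$.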

Now my actual answer:

Let $\mathcal M$ be the set of all $p\times n$ matrices for some $p,n\in\Bbb N$, and let $K$ be a finite set. Then $M\in \mathcal M^K$ (a sequence of matrices) is said to be linearly dependent if and only if $$\exists \alpha\in\Bbb R^K\setminus\{0\}:\sum_{k\in K} \alpha_k M_k=0.$$


If you want to be as clear as possible, this is what I would recommend.

Let $M_1, M_2, \dots, M_k$ be $p\times n$ matrices (where $k$, $p$, and $n$ are some positive integers). The collection $\{M_i\}_{1 \leq i \leq k}$ (where $i$ is an integer) is said to be linearly dependent iff $$\exists \{\alpha_i\}_{1 \leq i \leq k}\ (\text{where each $\alpha_i$ is a real number})\text{ such that } \sum_{1 \leq i \leq k} \alpha_i M_i = 0 \text{ and } \exists i,\ 1 \leq i \leq k, \text{ such that } \alpha_i \neq 0.$$