Almost sure linear independence of random vectors


Draw $m$ vectors with $n$ entries each, where $m \leq n$ and all entries are i.i.d. Gaussian random variables with mean zero and unit variance. How can one show that these $m$ vectors are linearly independent with probability one?

**Answer**

Say $m \leq n$ and we draw $m$ vectors in $\mathbb{R}^n$ according to any distribution which is absolutely continuous with respect to the Lebesgue measure on $\mathbb{R}^n$. We will show by induction on $m$ that the probability these vectors are linearly dependent is zero.

The probability that the first vector is the zero vector is obviously zero. Now suppose the result is true for $m-1$. Then the probability that the $m$ vectors are linearly dependent equals the probability that the $m$th is a linear combination of the first $m-1$ (every other failure mode occurs with probability zero by the inductive hypothesis). By that hypothesis the first $m-1$ vectors almost surely span a subspace of dimension $m-1 < n$, a proper subspace of $\mathbb{R}^n$, which has Lebesgue measure zero. So the $m$th vector falls into it with probability zero, and you are done.

(Induction is a bit of a crutch here: the idea of the proof is that any proper subspace has Lebesgue measure zero, so the vectors should fall into it with zero probability, but we have to get around the fact that there are infinitely many proper subspaces. A prettier proof would use Lebesgue measure on a Grassmannian.)
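The claim is easy to check numerically: a sketch (not part of the original answer, using NumPy) draws $m$ standard-Gaussian vectors in $\mathbb{R}^n$ as the rows of an $m \times n$ matrix and verifies that the matrix has full row rank $m$, which is equivalent to linear independence. The function name `draw_is_independent` and the trial counts are illustrative choices, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_is_independent(m, n):
    """Draw m i.i.d. standard-Gaussian vectors in R^n (rows of A)
    and test linear independence via the rank of A."""
    A = rng.standard_normal((m, n))
    # The m vectors are linearly independent iff rank(A) == m.
    return np.linalg.matrix_rank(A) == m

# Since the dependent event has probability zero, every trial
# should come back independent (up to floating-point rank tolerance).
trials = [draw_is_independent(m=5, n=8) for _ in range(1000)]
print(all(trials))  # expect True
```

Numerically the rank computation uses an SVD with a tolerance, so with probability one (and in every trial in practice) the sampled matrix is full rank.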