Probability that a random matrix is invertible.


Suppose that $x_{ij},i=1,2,\ldots,n;\,j=1,2,\ldots,m$ are independent and identically distributed continuous random variables. What is the probability that the vectors $$\left(\sum_{j=1}^mx_{1j},\sum_{j=1}^mx_{2j},\cdots,\sum_{j=1}^mx_{nj}\right)\\\left(\sum_{j=1}^mx_{1j}^2,\sum_{j=1}^mx_{2j}^2,\cdots,\sum_{j=1}^mx_{nj}^2\right)\\\vdots\\\left(\sum_{j=1}^mx_{1j}^n,\sum_{j=1}^mx_{2j}^n,\cdots,\sum_{j=1}^mx_{nj}^n\right)$$ are linearly independent? Is it $1$?

The probability is $1$ as can be shown by induction on $n$.

The claim is clear for $n=0$.

If the projections of the first $n-1$ vectors onto $\mathbb R^{n-1}$ (their first $n-1$ coordinates) are linearly independent, which by the induction hypothesis they almost surely are, then there is a unique (up to a constant factor) linear dependence among the projections of all $n$ vectors onto $\mathbb R^{n-1}$. Now fix all the $x_{i,j}$ except $x_{n,m}$; the condition that the full vectors are linearly dependent then says that $x_{n,m}$ is a root of a nonzero(!) polynomial, which happens with probability $0$.

(As there has been some discussion about the precise definition of a continuous random variable: the argument above works for any definition that implies $P(X=a)=0$ for all $a$. This is the case, for example, if one demands that a pdf exists.)
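To spell out the polynomial from the last step (the names $c_k$ and the choice of $x_{n,m}$ as the free entry are mine): let $c_1,\ldots,c_n$, not all zero, be the coefficients of the unique dependence among the projections. Since the projections already satisfy the dependence, the full vectors are linearly dependent precisely when the last coordinates do too, i.e. when $$\sum_{k=1}^n c_k\sum_{j=1}^m x_{nj}^k=0.$$ Viewed as a function of $t=x_{n,m}$ alone, with all other entries fixed, the left-hand side equals $p(t)+C$, where $p(t)=\sum_{k=1}^n c_k t^k$ is a nonzero polynomial and $C$ collects the terms not involving $x_{n,m}$; so it vanishes for at most $n$ values of $t$, a set of probability $0$.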


I suppose by continuous you mean absolutely continuous, i.e. with a density with respect to Lebesgue measure (the more general case where you allow singular continuous distributions is a bit more subtle). If $F(x_1,\ldots,x_m)$ is a non-constant polynomial, the variety $F^{-1}(0)$ has $m$-dimensional Lebesgue measure $0$, so if $(X_1, \ldots, X_m)$ are random variables with an absolutely continuous joint distribution, $P\{F(X_1,\ldots,X_m) = 0\} = 0$. In your case, linear dependence of your vectors translates into the vanishing of the determinant of the matrix formed from these vectors, and that determinant is a polynomial in the $nm$ variables $x_{ij}$ which is not identically zero: for instance, taking $x_{i1}=a_i$ with the $a_i$ distinct and nonzero and $x_{ij}=0$ for $j>1$ gives a scaled Vandermonde determinant, which is nonzero.
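A quick numerical sanity check of the claim (a sketch: the helper `power_sum_matrix` and the choice of standard normal entries are illustrative assumptions, not part of the question):

```python
import numpy as np

def power_sum_matrix(x):
    """Row k (k = 1..n) holds the power sums sum_j x[i, j]**k for i = 1..n."""
    n = x.shape[0]
    return np.array([(x ** k).sum(axis=1) for k in range(1, n + 1)])

# Monte Carlo sketch: draw i.i.d. entries (standard normals, an arbitrary
# absolutely continuous distribution) and look at the determinants.
rng = np.random.default_rng(0)
n, m, trials = 4, 6, 500
dets = [np.linalg.det(power_sum_matrix(rng.standard_normal((n, m))))
        for _ in range(trials)]
print(min(abs(d) for d in dets))  # expected to be strictly positive
```

Of course this only shows that a (numerically) zero determinant was never observed; the probability-$1$ statement itself is what the two answers above prove.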