Random vectors and basis


(The problem statement was given as an image, which is not reproduced here.)

Can someone give me some hints? The only thing I can think of is to prove that $x_1,\ldots,x_n$ are linearly independent, but I have no idea how to do it. Also, how should the random vector theorem be used? What exactly does it mean, and how does it work?


2 Answers

Best Answer

Heuristically speaking, relative to the standard bases on $\mathbb{R}^n$ and $\mathbb{R}^m$, simply observe that for the random vector $x=(x_1,\ldots,x_n)\in\mathbb{R}^n$ to also lie in the copy of $\mathbb{R}^m$ with $m<n$, the last $n-m$ coordinates of $x$ would have to be $0$. But $$ \Pr(x_i=0)=0,\;\forall\, 1\le i\le n, $$ so in particular the probability that all of the last $n-m$ coordinates vanish is $0$, and the result follows.
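A quick numerical illustration of this (a sketch using NumPy; the dimensions, the Gaussian distribution, and the sample size are arbitrary choices on my part):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, trials = 5, 3, 100_000

# Sample continuous random vectors in R^n and count how many land in the
# copy of R^m cut out by requiring the last n - m coordinates to be zero.
samples = rng.normal(size=(trials, n))
in_subspace = np.all(samples[:, m:] == 0.0, axis=1)
print(int(in_subspace.sum()))  # prints 0: the event has probability zero
```

No sample ever hits the lower-dimensional subspace, matching $\Pr(x_i = 0) = 0$ for a continuous distribution.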

Now if a collection of $n$ vectors $\{x_1,\ldots,x_n\}$ is linearly dependent, then there is at least one vector $x_i$ such that $$ x_i=a_1x_1+\cdots+a_{i-1}x_{i-1}+a_{i+1}x_{i+1}+\cdots+a_nx_n, $$ i.e. $x_i$ lies in the span of the others, which is a proper subspace of $\mathbb{R}^n$. Similar to our argument in the original theorem, $$ \Pr(\text{a random vector lies in a fixed proper subspace})=0. $$
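The contrapositive can be checked numerically: $n$ random vectors in $\mathbb{R}^n$ are almost surely linearly independent. A sketch (dimension, distribution, and trial count are my own choices), testing independence via the rank of the matrix whose columns are the vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 4, 1_000

# Draw n random vectors in R^n and check that the n x n matrix they form
# has full rank, i.e. the vectors are linearly independent.
full_rank = all(
    np.linalg.matrix_rank(rng.normal(size=(n, n))) == n
    for _ in range(trials)
)
print(full_rank)  # prints True: a random Gaussian matrix is almost
                  # surely invertible
```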


For i): $\mathcal{V}$ is defined by $n-m$ linear equations. Consider one of them, which we may write as $x_1=\sum_{i>1} a_ix_i$. Let $v=(v_i)\in \mathbb{R}^n$, and condition on the values of $v_2,\ldots,v_n$, leaving $v_1$ free. The probability that $v_1=\sum_{i>1}a_iv_i$ is $0$, since a continuous random variable takes any fixed value with probability $0$.

For ii): Let $v_1,\ldots,v_n$ be the random vectors, where $v_k=(v_{k,i})$. Consider the matrix $A=[v_1,\ldots,v_n]$ written in the canonical basis, and suppose that this system has rank $k<n$. We may assume that the submatrix on rows and columns $1..k$ (denoted by $B$) is invertible and the submatrix on rows and columns $1..k{+}1$ (denoted by $C$) is not. Condition on $v_1,\ldots,v_k$ and on $v_{k+1,1},\ldots,v_{k+1,k}$, leaving $v_{k+1,k+1}$ free. Expanding $\det(C)$ along its last row, $$\det(C)=v_{k+1,k+1}\det(B)+\sum_{i\leq k}a_iv_{k+1,i},$$ where the $a_i$ are cofactors not involving $v_{k+1,k+1}$. The probability that $v_{k+1,k+1}=\dfrac{-\sum_{i\leq k}a_iv_{k+1,i}}{\det(B)}$, i.e. that $\det(C)=0$, is $0$.
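The key algebraic fact in this expansion, namely that $\det(C)$ is an affine function of the single free entry with slope $\det(B)$, can be verified numerically. A sketch (the size $k$ and the random matrix are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3

# Random (k+1) x (k+1) matrix C; its leading k x k block is B.
C = rng.normal(size=(k + 1, k + 1))
B = C[:k, :k]

def det_with_free_entry(t):
    """det(C) viewed as a function of the single free entry C[k, k] = t."""
    M = C.copy()
    M[k, k] = t
    return np.linalg.det(M)

# Cofactor expansion along the last row gives det(C) = t*det(B) + const,
# so the slope in t must equal det(B).
slope = det_with_free_entry(1.0) - det_with_free_entry(0.0)
print(bool(np.isclose(slope, np.linalg.det(B))))  # prints True
```

Since the slope $\det(B)$ is nonzero, $\det(C)=0$ pins the free entry to a single value, which a continuous random variable avoids with probability $1$.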