This arises in time-series modelling. Suppose $Y_i \sim N_p(0,\Sigma_i)$, where the $Y_i$ are not necessarily independent but each $\Sigma_i$ is positive definite. Then for any fixed $a \neq 0$, $Y_i'a \neq 0$ with probability 1.
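Indeed, for any fixed $a \neq 0$,
$$ a'Y_i \sim N(0,\, a'\Sigma_i a), \qquad a'\Sigma_i a > 0, $$
so $a'Y_i$ is a continuous random variable and $P(a'Y_i = 0) = 0$. Now define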
$$ X_{2\times p}=\begin{bmatrix} Y_1' \\ Y_2' \\ \end{bmatrix}$$
My target: to show that $X$ has full column rank (w.p. 1). Let $\alpha=(\alpha_1\:\cdots\:\alpha_p)'$ be such that
$X\alpha=0\:$ w.p. 1 $\Rightarrow \begin{bmatrix} Y_1'\alpha \\ Y_2'\alpha \\ \end{bmatrix}=0\:\:$ w.p. 1 $\Rightarrow \alpha=0$, by the claim above.
But at this step I am perplexed: if I take $p=10$, then $X$ is a $2\times 10$ matrix with rank $10$ w.p. 1! Clearly I am making a serious mistake somewhere, but I am unable to find it.
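As a sanity check, here is a quick simulation (only a sketch in NumPy: I draw $Y_1, Y_2$ independently with identity-type covariances, purely for illustration), and it of course reports rank $2$, never $10$:

```python
import numpy as np

rng = np.random.default_rng(0)

p = 10
Sigma1 = np.eye(p)                # placeholder p.d. covariances, illustration only
Sigma2 = 2.0 * np.eye(p)

# Draw Y_1 and Y_2 (independently here, purely for the sake of the check)
Y1 = rng.multivariate_normal(np.zeros(p), Sigma1)
Y2 = rng.multivariate_normal(np.zeros(p), Sigma2)

X = np.vstack([Y1, Y2])           # the 2 x p matrix [Y_1' ; Y_2']
print(X.shape)                    # (2, 10)
print(np.linalg.matrix_rank(X))   # 2: the rank can never exceed min(2, p)
```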
Motivation: suppose $$ X_{T\times p}=\begin{bmatrix} Y_1' \\ Y_2' \\ \vdots \\ Y_T' \end{bmatrix}.$$
Then $X'X$ is p.d. with probability 1 whenever $T\geq p$, where the $Y_i$ have absolutely continuous distributions but may not be independent.
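Numerically this claim seems plausible; here is a rough check (again only a sketch: I draw the rows independently with a common identity covariance, even though in my setting the $Y_i$ may be dependent with different $\Sigma_i$):

```python
import numpy as np

rng = np.random.default_rng(1)

T, p = 15, 10                     # T >= p
Sigma = np.eye(p)                 # placeholder p.d. covariance, common to all rows

# Rows drawn independently here only to keep the check simple;
# the claim itself allows the Y_i to be dependent.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=T)   # T x p

XtX = X.T @ X
print(np.linalg.eigvalsh(XtX).min() > 0)   # True: X'X is p.d. when T >= p
```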
Can someone point out where I am going wrong? Any suggestions for study notes on random matrices would also be helpful.