Let's consider a toy example: $x$ and $y$ are random (column) vectors for which the covariance matrices $\operatorname{Var}(x)$ and $\operatorname{Var}(y)$ exist and are invertible. Then, $$ \operatorname{Var}(y)\succeq\operatorname{Cov}(y,x)[\operatorname{Var}(x)]^{-1}\operatorname{Cov}(x,y)\tag{$*$} $$ where $\succeq$ means the LHS matrix minus the RHS matrix is positive semidefinite and $\operatorname{Cov}(y,x)$, for example, is $E[(y-E(y))(x-E(x))']$.
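For concreteness, a quick numerical sanity check of $(*)$ (a sketch, using an arbitrary positive definite joint covariance built with NumPy; the dimensions and the construction of $S$ are my own choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 3, 2  # dim(x), dim(y)

# Build a random positive definite joint covariance matrix of (x, y)
A = rng.standard_normal((p + q, p + q))
S = A @ A.T + (p + q) * np.eye(p + q)

Vx = S[:p, :p]     # Var(x)
Vy = S[p:, p:]     # Var(y)
Cxy = S[:p, p:]    # Cov(x, y); Cov(y, x) is its transpose

# LHS minus RHS of (*): a Schur complement of the joint covariance
gap = Vy - Cxy.T @ np.linalg.inv(Vx) @ Cxy

# PSD check: smallest eigenvalue should be nonnegative (up to roundoff)
print(np.linalg.eigvalsh(gap).min() >= -1e-10)
```

The `gap` matrix is exactly the Schur complement of $\operatorname{Var}(x)$ in the joint covariance of $(x,y)$, which is why the check passes for any valid joint covariance.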
($*$) is true, and a proof can be found, for instance, in Tripathi (Economics Letters, 1999). My question: is the following also a valid proof?
Let $\{X_i,Y_i\}$, $i=1,\ldots,n$, be i.i.d. copies of $\{x-E(x),\,y-E(y)\}$. Let $X$ be the random matrix with rows $X_1',X_2',\ldots,X_n'$ and let $Y$ be formed similarly. Then, as $n\to\infty$, $$ X'X/n\overset{\text{P}}{\to}\operatorname{Var}(x),\quad Y'Y/n\overset{\text{P}}{\to}\operatorname{Var}(y),\quad X'Y/n\overset{\text{P}}{\to}\operatorname{Cov}(x,y) $$ where $\overset{\text{P}}{\to}$ denotes convergence in probability. But for each finite $n$ (with $X'X$ invertible), $$ Y'Y/n-(Y'X/n)(X'X/n)^{-1}(X'Y/n)=\frac{1}{n}Y'[I_n-X(X'X)^{-1}X']Y\succeq 0\tag{$**$} $$ because the matrix in square brackets on the RHS is an orthogonal projection matrix, hence positive semidefinite. So ($*$) follows from ($**$) by letting $n\to\infty$, since the cone of positive semidefinite matrices is closed.
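The finite-$n$ identity in $(**)$ is easy to verify numerically (a sketch; the particular data-generating process below is arbitrary and chosen only so that $Y$ and $X$ are correlated):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 200, 3, 2

# Mean-zero draws: rows of X play the role of X_i', rows of Y the role of Y_i'
X = rng.standard_normal((n, p))
Y = X @ rng.standard_normal((p, q)) + rng.standard_normal((n, q))

# Annihilator M = I - X(X'X)^{-1}X', an orthogonal projection matrix
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

# LHS of (**): Y'Y/n - (Y'X/n)(X'X/n)^{-1}(X'Y/n)
lhs = (Y.T @ Y - Y.T @ X @ np.linalg.solve(X.T @ X, X.T @ Y)) / n

# The identity: LHS equals Y' M Y / n, which is PSD since M = M' = M^2
print(np.allclose(lhs, Y.T @ M @ Y / n))
print(np.linalg.eigvalsh(lhs).min() >= -1e-10)
```

`np.linalg.solve` is used instead of forming the inverse of $X'X$ explicitly, which is the numerically preferred way to apply $(X'X)^{-1}$.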