Question
My question concerns how to go from step (1) to step (2). This is mainly a minor linear algebra/statistics problem, but I've included extra details for completeness.
Starting from the definition, I get $$ \sum_{1\leq i\leq n}E[\tilde{V}_{i,n}\tilde{V}_{i,n}'|w_{i,n}]=\omega_n\sum_{1\leq i\leq n}E\Big[\sum^n_{j=1}M_{ij,n}^2V_{j,n}V_{j,n}'\Big|w_{i,n}\Big], $$ which does not obviously simplify so that only the $M_{ii,n}$ term survives.
Background
The following extract comes from this statistics paper (link). It derives a heteroskedasticity-robust standard error for the OLS estimator under the assumption that the number of regressors grows in proportion to the sample size (with the ratio bounded below 1).
\begin{align} \lambda_{\min}(E[\tilde{\Gamma}_n|\mathcal{W}_n]) &=\lambda_{\min}\Big(\frac{1}{n}\sum_{1\leq i\leq n}E[\tilde{V}_{i,n}\tilde{V}_{i,n}'|w_{i,n}]\Big) & (1)\\ &=\omega_n\lambda_{\min}\Big(\frac{1}{n}\sum_{1\leq i\leq n}M_{ii,n}E[V_{i,n}V_{i,n}'|w_{i,n}]\Big) & (2)\\ &\geq \omega_n \frac{1}{n}\sum_{1\leq i\leq n}M_{ii,n}\lambda_{\min}(E[V_{i,n}V_{i,n}'|w_{i,n}])\\ &\geq \omega_n\frac{(1-\frac{K_n}{n})}{C_n^{LR}} \end{align} so $\frac{1}{\lambda_{\min}(E[\tilde{\Gamma}_n|\mathcal{W}_n])}=O_p(1)$ because $P[\omega_n=1]\rightarrow 1$, $\overline{\lim}_{n\rightarrow\infty}K_n/n<1$ and $C_n^{LR}=O_p(1)$.
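As an aside, the third line of the display uses that $\lambda_{\min}$ is superadditive over symmetric matrices ($\lambda_{\min}(X+Y)\geq\lambda_{\min}(X)+\lambda_{\min}(Y)$) together with $M_{ii,n}\geq 0$. A quick numerical sanity check of that inequality, with arbitrary PSD matrices standing in for the conditional covariances and made-up weights standing in for $M_{ii,n}$:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 3, 8                      # arbitrary dimensions for the check

# Random PSD matrices A_i (stand-ins for E[V_i V_i' | w_i]) and weights
# c_i in [0, 1] (stand-ins for the diagonal entries M_ii,n).
As = []
for _ in range(n):
    B = rng.standard_normal((d, d))
    As.append(B @ B.T)
cs = rng.uniform(0, 1, size=n)

lmin = lambda A: np.linalg.eigvalsh(A)[0]    # smallest eigenvalue
lhs = lmin(sum(c * A for c, A in zip(cs, As)))
rhs = sum(c * lmin(A) for c, A in zip(cs, As))

# lambda_min(X + Y) >= lambda_min(X) + lambda_min(Y) for symmetric X, Y
assert lhs >= rhs - 1e-10
```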
Notation:
- $M_{ij,n}=1(i=j)-w_{i,n}'(\sum^n_{k=1}w_{k,n}w_{k,n}')^{-1}w_{j,n}$ (the $M_{ij,n}$ are the entries of an orthogonal projection matrix $M_n$, so $M_n$ is symmetric and idempotent)
- $\mathcal{W}_n=(w_{1,n},\ldots,w_{n,n})$
- $V_{i,n}=x_{i,n}-E[x_{i,n}|\mathcal{W}_n]$
- $\omega_n=1\{\lambda_{\min}(\sum^n_{k=1}w_{k,n}w_{k,n}')>0\}$
- $\tilde{V}_{i,n}=\omega_n\sum^n_{j=1}M_{ij,n}V_{j,n}$
- $\tilde{\Gamma}_n=\frac{1}{n}\sum^n_{i=1}\tilde{V}_{i,n}\tilde{V}_{i,n}'$
Assumptions
- $\{(y_{i,n},x_{i,n}',w_{i,n}'):1\leq i\leq n\}$ are iid over $i$.
- $P[\omega_n=1]\rightarrow 1$, $\overline{\lim}_{n\rightarrow\infty}K_n/n<1$ and $C_n^{LR}=O_p(1)$, where $$ C_n^{LR}=\max_{1\leq i\leq n}\Big\{ E[u_{i,n}^4| x_{i,n},w_{i,n}]+E[\lVert V_{i,n}\rVert^4|w_{i,n}] \Big\} +\max_{1\leq i\leq n}\Big\{\frac{1}{E[u_{i,n}^2|x_{i,n}, w_{i,n}]}+\frac{1}{\lambda_{\min}(E[V_{i,n}V_{i,n}'|w_{i,n}])}\Big\} $$
- $E[\lVert x_{i,n}\rVert^2]=O(1)$, $nE[(E[u_{i,n}|x_{i,n},w_{i,n}])^2]=o(1)$, $\max_{1\leq i\leq n}\lVert \hat{v}_{i,n}\rVert/\sqrt{n}=o_p(1)$.
To go from (1) to (2), we need the fact $$ \sum^n_{j=1}M_{ij}^2=M_{ii}, $$ which holds because $M_n$ is symmetric and idempotent: $M_{ii}=(M_n^2)_{ii}=\sum^n_{j=1}M_{ij}M_{ji}=\sum^n_{j=1}M_{ij}^2$.
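This row-sum identity is easy to verify numerically. A minimal NumPy sketch, with an arbitrary design matrix `W` standing in for $(w_{1,n},\ldots,w_{n,n})'$ (the dimensions are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 20, 5                     # hypothetical sizes: n observations, K regressors
W = rng.standard_normal((n, K))

# M = I - W (W'W)^{-1} W' projects onto the orthogonal complement of col(W).
M = np.eye(n) - W @ np.linalg.solve(W.T @ W, W.T)

# Symmetric and idempotent, hence sum_j M_ij^2 = (M^2)_ii = M_ii.
assert np.allclose(M, M.T)
assert np.allclose(M @ M, M)
assert np.allclose((M**2).sum(axis=1), np.diag(M))
```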
Expanding $\tilde{V}_{i,n}\tilde{V}_{i,n}'=\omega_n\sum^n_{j=1}\sum^n_{k=1}M_{ij,n}M_{ik,n}V_{j,n}V_{k,n}'$ (using $\omega_n^2=\omega_n$) and taking conditional expectations, the cross terms ($j\neq k$) vanish because the $V_{j,n}$ are independent across $j$ with $E[V_{j,n}|\mathcal{W}_n]=0$; by the iid assumption, $E[V_{j,n}V_{j,n}'|\mathcal{W}_n]=E[V_{j,n}V_{j,n}'|w_{j,n}]$. Hence \begin{align} \sum^n_{i=1}E[\tilde{V}_{i,n}\tilde{V}_{i,n}'|\mathcal{W}_n] &=\omega_n\sum^n_{i=1}\sum^n_{j=1}M_{ij,n}^2E[V_{j,n}V_{j,n}'|w_{j,n}]\\ &=\omega_n\sum^n_{j=1}\Big(\sum^n_{i=1}M_{ij,n}^2\Big)E[V_{j,n}V_{j,n}'|w_{j,n}]\\ &=\omega_n\sum^n_{j=1}M_{jj,n}E[V_{j,n}V_{j,n}'|w_{j,n}], \end{align} and renaming the index $j$ to $i$ gives (2).
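The whole equality can also be checked by Monte Carlo: with iid mean-zero rows $V_{j,n}$ of common covariance $\Sigma$, the claim reduces to $E[\sum_i\tilde{V}_{i,n}\tilde{V}_{i,n}']=\sum_i M_{ii,n}\Sigma=(n-K_n)\Sigma$. A sketch (the sizes and $\Sigma$ are arbitrary choices for the check):

```python
import numpy as np

rng = np.random.default_rng(1)
n, K, d = 15, 4, 2               # hypothetical sizes: n obs, K columns in w, d-dimensional V
W = rng.standard_normal((n, K))
M = np.eye(n) - W @ np.linalg.solve(W.T @ W, W.T)   # annihilator of W

# Common conditional covariance Sigma = E[V_j V_j' | w_j] (same for every j here).
B = rng.standard_normal((d, d))
Sigma = B @ B.T + np.eye(d)

reps = 20000
V = rng.multivariate_normal(np.zeros(d), Sigma, size=(reps, n))  # (reps, n, d) iid mean-zero rows
Vt = M @ V                                   # tilde V_i = sum_j M_ij V_j, per replication
lhs = np.einsum('rid,rie->de', Vt, Vt) / reps   # Monte Carlo E[sum_i tilde V_i tilde V_i']
rhs = np.trace(M) * Sigma                    # sum_i M_ii * Sigma = (n - K) Sigma

assert np.max(np.abs(lhs - rhs)) < 0.05 * np.abs(rhs).max()
```

Fixing the seed makes the check reproducible; the 5% tolerance leaves ample room for Monte Carlo error at 20000 replications.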