Suppose $E_i$'s are iid random matrices (not square) such that $\mathbb{E}[E_i] = 0$ and $\mathbb{E}[E_i^TE_j] = 0$ for $i\ne j$.
Let $M = \sum_{i=1}^n E_i$. It is clear that $$ H^* = \mathbb{E}[M^TM] = \sum_{i=1}^n \mathbb{E}[E_i^TE_i]. $$ Suppose further that $0 < \lambda_{\min}(H^*) \le \lambda_{\max}(H^*) < \infty$.
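(For completeness, the identity follows by expanding the product and using the assumed orthogonality of the cross terms:)
$$ \mathbb{E}[M^TM] = \sum_{i,j} \mathbb{E}[E_i^TE_j] = \sum_{i=1}^n \mathbb{E}[E_i^TE_i] + \sum_{i\ne j} \underbrace{\mathbb{E}[E_i^TE_j]}_{=\,0} = \sum_{i=1}^n \mathbb{E}[E_i^TE_i]. $$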
I would like to have a "Matrix Chernoff" like inequality: $$ P\left[ \lambda_{\min}(M^TM) > (1-\epsilon)\lambda_{\min}(H^*) \right] > 1 - g(\epsilon). $$
However, the standard Matrix Chernoff bound does not apply directly, because $M^TM$ is not a sum of independent Hermitian matrices.
Note that $$ M^TM = G + Y, \qquad Y = \sum_{i \ne j} E_i^TE_j, \qquad G = \sum_{i=1}^n E_i^TE_i. $$ But I feel that this decomposition could work, as the contribution of $Y$ to $M^TM$ should be small. However, since $Y$ is no longer positive semidefinite (although symmetric), I am not sure how to proceed.
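To make the intuition concrete, here is a small numerical sketch of the decomposition (the Gaussian entries, the $1/\sqrt{n}$ scaling, and the dimensions are my own illustrative assumptions, not part of the question): it draws iid zero-mean rectangular matrices, forms $M^TM = G + Y$, and compares $\lambda_{\min}(G)$ with the spectral norm of the indefinite cross-term $Y$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, d = 200, 50, 5          # hypothetical: n summands, each E_i is p x d
scale = 1.0 / np.sqrt(n)      # keeps H* of constant size as n grows

# iid zero-mean matrices (Gaussian entries chosen purely for illustration);
# independence gives E[E_i^T E_j] = 0 for i != j automatically
E = rng.normal(size=(n, p, d)) * scale

M = E.sum(axis=0)
G = sum(Ei.T @ Ei for Ei in E)   # "diagonal" part: a sum of Gram matrices, PSD
Y = M.T @ M - G                  # cross terms: symmetric but indefinite

# If the heuristic is right, ||Y||_2 is small relative to lambda_min(G),
# so lambda_min(M^T M) >= lambda_min(G) - ||Y||_2 stays bounded away from 0.
print("lambda_min(G)     =", np.linalg.eigvalsh(G)[0])
print("||Y||_2           =", np.linalg.norm(Y, 2))
print("lambda_min(M^T M) =", np.linalg.eigvalsh(M.T @ M)[0])
```

Of course this only checks one realization; the question is precisely how to control $\|Y\|_2$ with high probability.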
Any help/comments/answers would be much appreciated.