Suppose we have an $l\times l$ real matrix $M=(X'X)^{-1}X'\Sigma X (X'X)^{-1}$, where $X$ is an $n\times l$ real matrix and $\Sigma$ is an $l\times l$ real symmetric positive definite matrix. Assume $X'X$ is invertible and its minimum eigenvalue $\lambda_{min}(X'X)$ diverges to $\infty$ as $n\rightarrow \infty$. How can we show that $t'Mt\rightarrow 0$ for any vector $t$ with unit norm, i.e., $t't=1$?
I know how to show the claim when $M=(X'X)^{-1}$: in that case, $t'Mt\leq \lambda_{max}((X'X)^{-1})t't=\frac{1}{\lambda_{min}(X'X)}\rightarrow 0$. It seems that a similar argument does not apply here, because we can no longer single out an eigenvalue of $X'X$ or $(X'X)^{-1}$.
In the proof, for a vector $x$, we let $\|x\|$ denote the length of $x$, so $x'x=\|x\|^2$. Note that for any matrix $D$ and any vector $x$ such that $Dx$ is defined, $x'D'Dx=(Dx)'(Dx)=\|Dx\|^2$.
Since $\Sigma$ is symmetric and positive definite, we can write $$\Sigma=T'T$$ for some $l\times l$ matrix $T$ (for instance, the transpose of a Cholesky factor, or the symmetric square root $\Sigma^{1/2}$). Define $$A=X(X'X)^{-1}$$ and $$B=TA.$$ Note that $$B'B=A'T'TA=(X'X)^{-1}X'\Sigma X(X'X)^{-1}=M$$ and $$A'A=[(X'X)^{-1}X'][X(X'X)^{-1}]=(X'X)^{-1}.$$ Then for $t$ with $t't=1$, writing $\|T\|$ for the operator norm of $T$, we have \begin{align*} t'Mt & =(Bt)'(Bt)=\|Bt\|^2=\|TAt\|^2\leqslant \|T\|^2 \|At\|^2 = \|T\|^2(At)'(At) \\ & = \|T\|^2(t'A'At) = \|T\|^2 (t'(X'X)^{-1}t).\end{align*}
Since $T$ does not depend on $n$, the rest of the proof is handled by the case you presented in the question.
EDIT: For fixed $r,s$, let $\mathcal{M}_{r,s}$ denote the space of all $r\times s$ matrices with entries in $\mathbb{R}$. Everything we say below is for $\mathbb{R}$, but it also holds for $\mathbb{C}$, except that instead of the transpose, we need the conjugate transpose. Let's look at 1. trace duality, and 2. a few different norms.
We can consider $\mathbb{R}^n$ to be equal to $\mathcal{M}_{n,1}$ ($n$ rows, $1$ column).
Recall $\|\cdot\|_1$, $\|\cdot\|_2$, and $\|\cdot\|_\infty$ on $\mathbb{R}^n$ given by $$\|x\|_1=\sum_{i=1}^n |x_i|,$$ $$\|x\|_2=\Bigl(\sum_{i=1}^n |x_i|^2\Bigr)^{1/2},$$ and $$\|x\|_\infty=\max_{1\leqslant i\leqslant n}|x_i|.$$ We can check that $$\|x\|_\infty\leqslant \|x\|_2\leqslant \|x\|_1.$$ Note that if $p=q=2$, or $p=1$ and $q=\infty$, or $p=\infty$ and $q=1$, then $$\|x\|_p=\max\{|y'x|:\|y\|_q=1\},$$ where $y'$ denotes the transpose and $y'x$ denotes the dot product. Technically, $y'x$ is a $1\times 1$ matrix whose single entry is the dot product of $x$ and $y$, but the notation $y'x$ to denote the dot product is common. If we wanted to be precise, we could write $\text{trace}(y'x)$ instead of $y'x$, since the trace of a $1\times 1$ matrix is just its single entry. We also note that $$\|x\|_2^2=x'x.$$ Or, if we want to be precise, $$\|x\|_2^2=\text{trace}(x'x).$$
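These duality relations are easy to confirm numerically; for each pair $(p,q)$ the maximum is attained at an explicit $y$ (the sign vector, a signed standard basis vector, or $x/\|x\|_2$). A sketch in numpy with an arbitrary random $x$:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(5)

norm1 = np.sum(np.abs(x))
norm2 = np.sqrt(np.sum(x**2))
norm_inf = np.max(np.abs(x))

assert norm_inf <= norm2 <= norm1

# ||x||_1 = max{|y'x| : ||y||_inf = 1}, attained at y = sign(x).
assert np.isclose(norm1, np.sign(x) @ x)

# ||x||_inf = max{|y'x| : ||y||_1 = 1}, attained at a signed basis vector.
i = np.argmax(np.abs(x))
e = np.zeros(5)
e[i] = np.sign(x[i])
assert np.isclose(norm_inf, e @ x)

# ||x||_2 = max{|y'x| : ||y||_2 = 1}, attained at y = x/||x||_2.
assert np.isclose(norm2, (x / norm2) @ x)
```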
Consider $T\in \mathcal{M}_{m,n}$ as an operator from $\mathbb{R}^n$ to $\mathbb{R}^m$ with action given by matrix multiplication, so $Tx$ is just the matrix product of $T$ with $x\in \mathbb{R}^n$. We define the operator norm $\|T\|_\infty$ by $$\|T\|_\infty=\sup\{\|Tx\|_2:\|x\|_2=1\}.$$ We define the Frobenius norm of $T$ by $$\|T\|_2=\bigl(\text{trace}(T'T)\bigr)^{1/2},$$ where again, $T'$ denotes the transpose. Last, we define the trace class norm $\|T\|_1$ by $$\|T\|_1=\max\{\text{trace}(S'T):\|S\|_\infty=1\}.$$ We note that these definitions exactly generalize those for the norms on $\mathbb{R}^n$. They're not just analogous. The norms on $\mathbb{R}^n$ are, in fact, just the $\mathcal{M}_{n,1}$ case of the norms we've defined here. We still have $$\|T\|_\infty\leqslant \|T\|_2\leqslant \|T\|_1$$ and, if $$(p,q)\in \{(2,2), (1,\infty),(\infty,1)\},$$ $$\|T\|_p=\max\{\text{trace}(S'T):\|S\|_q=1\}.$$ This is called trace duality.
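In terms of the singular values $s_1\geqslant s_2\geqslant\ldots$ of $T$, these three norms are $s_1$, $(\sum_i s_i^2)^{1/2}$, and $\sum_i s_i$ respectively (numpy calls the last one the "nuclear" norm). A sketch verifying this and trace duality for $p=1$, where the maximum is attained at $S=UV'$ from the SVD $T=U\,\mathrm{diag}(s)\,V'$; the matrix is an arbitrary random example:

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.standard_normal((4, 3))
s = np.linalg.svd(T, compute_uv=False)  # singular values

op = s[0]                         # ||T||_inf, operator norm
fro = np.sqrt(np.trace(T.T @ T))  # ||T||_2, Frobenius norm
nuc = np.sum(s)                   # ||T||_1, trace class (nuclear) norm

assert np.isclose(op, np.linalg.norm(T, 2))
assert np.isclose(fro, np.linalg.norm(T, 'fro'))
assert np.isclose(nuc, np.linalg.norm(T, 'nuc'))
assert op <= fro <= nuc

# Trace duality: ||T||_1 = max{trace(S'T) : ||S||_inf = 1}, attained at
# S = U V' (all singular values of S equal 1, so ||S||_inf = 1).
U, _, Vt = np.linalg.svd(T, full_matrices=False)
S = U @ Vt
assert np.isclose(np.trace(S.T @ T), nuc)
```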
Let $S,T$ be two members of $\mathcal{M}_{m,n}$. Let $S^i_j$ denote the row $i$, column $j$ entry of $S$. Similarly, let $T^i_j$ denote the row $i$, column $j$ entry of $T$. Then $$\text{trace}(S'T)=\sum_{i=1}^m\sum_{j=1}^n S^i_j T^i_j.$$ We can see that this is more or less a direct analogue of the dot product, so it's not surprising that they generalize the properties of the dot product and duality from $\mathbb{R}^n$.
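The entrywise formula for $\text{trace}(S'T)$ is immediate to check numerically (a sketch with arbitrary random matrices):

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.standard_normal((4, 3))
T = rng.standard_normal((4, 3))

# trace(S'T) equals the entrywise "dot product" sum_{i,j} S^i_j T^i_j.
assert np.isclose(np.trace(S.T @ T), np.sum(S * T))
```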
Another way to generalize the norms to $\mathcal{M}_{m,n}$, one which does not go through trace duality, is entrywise: $$|T|_\infty=\max_{i,j}|T^i_j|$$ $$|T|_1=\sum_{i,j}|T^i_j|.$$ It is true that $$|T|_\infty\leqslant \|T\|_\infty \leqslant \|T\|_2\leqslant \|T\|_1\leqslant |T|_1.$$ So we can think about the $\|\cdot\|_\infty$ norm as being "interpolated" between $|\cdot|_\infty$ and $\|\cdot\|_2$. The norm $\|T\|_\infty$ can often be difficult to calculate, but a little bit of intuition is that, "along rows", the $\|\cdot\|_\infty$ norm behaves like the $\ell_2$ norm, while "along the diagonal", the $\|\cdot\|_\infty$ norm behaves like $\ell_\infty$. Similarly (or, dually), $\|\cdot\|_1$ behaves like $\ell_2$ on rows and like $\ell_1$ on diagonals.
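The full chain of inequalities can be checked on a random example (a sketch; the matrix size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
T = rng.standard_normal((5, 4))

entry_max = np.max(np.abs(T))   # |T|_inf, largest entry
op = np.linalg.norm(T, 2)       # ||T||_inf, operator norm
fro = np.linalg.norm(T, 'fro')  # ||T||_2, Frobenius norm
nuc = np.linalg.norm(T, 'nuc')  # ||T||_1, trace class norm
entry_sum = np.sum(np.abs(T))   # |T|_1, entrywise sum

assert entry_max <= op <= fro <= nuc <= entry_sum
```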
By definition of $\|T\|_\infty$ and homogeneity, $$\|Tx\|_2\leqslant \|T\|_\infty \|x\|_2,$$ which is the inequality relevant to the original problem. Let's last address the question of how we know $\|T\|_\infty$ is finite for a fixed $l\times l$ matrix. Let $T^i$ denote the $i^{th}$ row of $T$ and fix $x\in \mathbb{R}^n$ with $\|x\|_2=1$. The $T^i$ are row vectors, and, using our preceding $T^i_j$ notation, $$T^i=\begin{pmatrix} T^i_1 & T^i_2 & \ldots & T^i_n\end{pmatrix}.$$ We want to get an upper estimate on $\|Tx\|_2$. Note that $$Tx=\begin{pmatrix} T^1x \\ T^2x \\ \vdots \\ T^mx\end{pmatrix},$$ so $$\|Tx\|_2^2=\sum_{i=1}^m |T^ix|^2.$$ By the Cauchy-Schwarz inequality, and using the fact that $\|x\|_2=1$, $$|T^ix|\leqslant \|T^i\|_2\|x\|_2 = \|T^i\|_2.$$ Therefore $$\|Tx\|_2^2=\sum_{i=1}^m |T^ix|^2\leqslant \sum_{i=1}^m \|T^i\|_2^2.$$ But one checks by direct calculation that $$\sum_{i=1}^m \|T^i\|_2^2=\|T\|_2^2.$$ Therefore $\|T\|_\infty\leqslant \|T\|_2$. Since $$\|T\|_2^2=\sum_{i,j}|T^i_j|^2,$$ this is finite.
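The row-by-row Cauchy-Schwarz argument above can also be traced numerically (a sketch with an arbitrary random $T$ and unit vector $x$):

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 4, 6
T = rng.standard_normal((m, n))
x = rng.standard_normal(n)
x /= np.linalg.norm(x)  # unit vector, ||x||_2 = 1

lhs = np.linalg.norm(T @ x)**2                    # ||Tx||_2^2 = sum_i |T^i x|^2
row_bound = np.sum(np.linalg.norm(T, axis=1)**2)  # sum_i ||T^i||_2^2

# The row bound is exactly ||T||_2^2, and it dominates ||Tx||_2^2.
assert np.isclose(row_bound, np.linalg.norm(T, 'fro')**2)
assert lhs <= row_bound + 1e-12
# Hence the operator norm is bounded by the Frobenius norm.
assert np.linalg.norm(T, 2) <= np.linalg.norm(T, 'fro')
```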