In the linear regression model, we have $$ y=X\beta+\epsilon, $$ where $y$, $X$, and $\epsilon$ are random vectors/matrices of dimensions $n\times 1$, $n\times r$, and $n\times 1$, respectively.
When computing the OLS estimator, we encounter the matrix $X'X$, which is symmetric and invertible, where $X$ is the matrix of data. When computing the variance of the OLS estimator, we often encounter the matrix $X'\Omega X$, where $\Omega$ is the covariance matrix, which is symmetric and positive definite.
Can we obtain bounds on the eigenvalues of $X'\Omega X$ in terms of $\lambda(X'X)$ or $\operatorname{tr}(X'X)$ and $\lambda(\Omega)$? Currently, I can get a bound like $\lambda(X)^2 \lambda(\Omega)$ when $X$ is also symmetric.
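As a quick numerical sanity check of the symmetric-case bound $\lambda(X)^2\lambda(\Omega)$ (reading $\lambda(X)^2$ as the largest squared eigenvalue of the symmetric $X$), here is a short NumPy sketch with randomly generated matrices; the specific sizes and seed are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Symmetric X, as assumed in the question.
A = rng.standard_normal((n, n))
X = (A + A.T) / 2

# A symmetric positive definite Omega.
B = rng.standard_normal((n, n))
Omega = B @ B.T + n * np.eye(n)

# Largest eigenvalue of X' Omega X (symmetric, so eigvalsh applies).
lhs = np.linalg.eigvalsh(X.T @ Omega @ X).max()

# lambda(X)^2: largest squared eigenvalue of symmetric X
# (equivalently, the squared spectral norm of X).
lam_X_sq = np.abs(np.linalg.eigvalsh(X)).max() ** 2
bound = lam_X_sq * np.linalg.eigvalsh(Omega).max()
```

With these draws, `lhs` stays below `bound`, consistent with the claimed inequality.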
Your upper bound extends to the more general $$ \lambda_{\max}(X'\Omega X) \leq \lambda_{\max}(X'X)\,\lambda_{\max}(\Omega). $$ Since $X'\Omega X$ is positive semidefinite, all of its eigenvalues are non-negative and satisfy this upper bound.
This can be shown as follows. Let $\|X\| = \sqrt{\lambda_{\max}(X'X)}$ denote the spectral norm of $X$, so that $\|Xv\| \leq \|X\|\|v\|$ for every $v$. We have \begin{align} \lambda_{\max}(X'\Omega X) &= \max_{\|v\| \leq 1} v'(X'\Omega X)v = \max_{\|v\| \le 1} (Xv)'\Omega (Xv) \\ & = \|X\|^2 \max_{\|v\| \leq 1} \left(\frac{Xv}{\|X\|} \right)' \Omega \left(\frac{Xv}{\|X\|} \right) \\ & \leq \|X\|^2 \max_{\|w\| \leq 1} w'\Omega w = \|X\|^2 \cdot \lambda_{\max}(\Omega), \end{align} where the inequality holds because $\|Xv\|/\|X\| \leq \|v\| \leq 1$, so the vector $w = Xv/\|X\|$ lies in the unit ball.
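The general bound can also be checked numerically for a non-square $n \times r$ data matrix; this is a small NumPy sketch with arbitrary dimensions and a random positive definite $\Omega$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 50, 4

X = rng.standard_normal((n, r))        # n x r data matrix, no symmetry assumed

B = rng.standard_normal((n, n))
Omega = B @ B.T + np.eye(n)            # symmetric positive definite n x n

# Left side: largest eigenvalue of the r x r matrix X' Omega X.
lhs = np.linalg.eigvalsh(X.T @ Omega @ X).max()

# Right side: lambda_max(X'X) * lambda_max(Omega).
rhs = np.linalg.eigvalsh(X.T @ X).max() * np.linalg.eigvalsh(Omega).max()
```

Here `lhs <= rhs` holds, matching the derivation: `np.linalg.eigvalsh(X.T @ X).max()` is exactly the squared spectral norm $\|X\|^2$ used in the proof.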