Background
I have come across a problem where I want to derive a concentration inequality for sub-Gaussian random variables. More precisely, I want to bound the spectrum of a certain empirical covariance matrix, where the $(x_i)$ are sub-Gaussian random vectors forming the rows of a matrix $X \in \mathbb{R}^{T\times n}$. Without specifying exactly what the dependence structure of the $x_i$ is, let us just say I want to bound the probability of an event of the form $$ \left\|\frac{1}{T}X^*X-\Sigma \right\|_\infty \geq f(T,\dots). $$
My argument relies on a fact which holds (more or less) uniquely for Gaussian random variables (a quotient of moment generating functions appears somewhere in my argument); I have verified it to be true in the Gaussian case, but let us not dwell on this.
Question
Now, the notion of sub-Gaussianity is meant to capture the fact that the distribution is less wild than a Gaussian (this is displayed in particular by the moment-generating-function definition, so in the scalar case my question is trivial).
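For concreteness, the scalar statement I have in mind is the standard MGF characterization: a centered random variable $X$ is sub-Gaussian with variance proxy $\sigma^2$ precisely when its MGF is dominated by that of $g \sim \mathcal{N}(0,\sigma^2)$,
$$ \mathbb{E}\, e^{\lambda X} \;\leq\; e^{\lambda^2 \sigma^2 / 2} \;=\; \mathbb{E}\, e^{\lambda g} \qquad \text{for all } \lambda \in \mathbb{R}, $$
so in one dimension the "domination by a Gaussian" is built into the definition itself.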
So, this got me thinking: is there a generic way to prove such an inequality in the Gaussian case and then use some sort of "extension/domination lemma" to transfer the result to the sub-Gaussian case?
I can already see one problem with such an approach, since the entries of $X^*X$ are really sub-exponential and no longer sub-Gaussian (although one can imagine dominating them by $\chi^2$ variables).
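To make this sub-exponential point precise in Orlicz-norm notation (writing $\|\cdot\|_{\psi_2}$ for the sub-Gaussian norm and $\|\cdot\|_{\psi_1}$ for the sub-exponential norm): the product of two sub-Gaussian variables is sub-exponential,
$$ \|XY\|_{\psi_1} \;\leq\; \|X\|_{\psi_2}\,\|Y\|_{\psi_2}, $$
so each entry $(X^*X)_{jk} = \sum_t x_{tj}\,x_{tk}$ is a sum of sub-exponential terms, which is why Bernstein-type rather than Hoeffding-type tails appear at this stage.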
I would be very interested in any links, references, ideas, or partial results that could help me better understand this problem. I am not interested in finding a different way to prove my bound; I really would like to explore this sort of "domination" argument!
You can sometimes use covering-number arguments to get similar results for sub-Gaussian distributions. The result you are interested in is Theorem 4.6.1 (see Eq. 4.22) in the book [1]. The book is available online, and the proof uses a covering-number argument.
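Roughly stated (as I recall it; check the book for the exact constants and the non-isotropic version with general $\Sigma$): if $A$ is an $m \times n$ matrix whose rows $A_i$ are independent, mean-zero, isotropic sub-Gaussian vectors with $\|A_i\|_{\psi_2} \leq K$, then for every $t \geq 0$, with probability at least $1 - 2e^{-t^2}$,
$$ \left\| \frac{1}{m} A^* A - I_n \right\| \;\leq\; K^2 \max(\delta, \delta^2), \qquad \delta = C\left( \sqrt{\frac{n}{m}} + \frac{t}{\sqrt{m}} \right). $$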
Also, Lemma 6.2.3 in [1] gives an example where the moment generating function of a sub-Gaussian random vector is bounded by the MGF of a Gaussian vector.
[1]: High-Dimensional Probability, Roman Vershynin. https://www.math.uci.edu/~rvershyn/papers/HDP-book/HDP-book.html