Recently I encountered a paper that imposes a sub-Gaussian condition on the covariate vector $\boldsymbol{x}$ as follows:
There exists some universal constant $\sigma_x \in[1, \infty)$ such that $$ \forall \boldsymbol{v} \in \mathbb{R}^p, \quad \mathbb{E}\left[\exp \left\{\boldsymbol{v}^{\top} \boldsymbol{\Sigma}^{-1 / 2} \boldsymbol{x}\right\}\right] \leq \exp \left(\frac{\sigma_x^2}{2} \cdot\|\boldsymbol{v}\|_2^2\right), $$ where $\boldsymbol{\Sigma}$ is the covariance matrix of $\boldsymbol{x}$.
Meanwhile, it imposes a sub-Gaussian condition on the error as follows:
There exists some universal constant $\sigma_{\varepsilon} \in \mathbb{R}^{+}$ such that $$ \forall \lambda \in \mathbb{R}, \quad \mathbb{E}\left[e^{\lambda \varepsilon}\right] \leq e^{\frac{1}{2} \lambda^2 \sigma_{\varepsilon}^2}. $$
My confusion: In the first condition, the sub-Gaussian parameter $\sigma_x$ is attached to the whitened vector $\boldsymbol{\Sigma}^{-1/2}\boldsymbol{x}$ rather than to $\boldsymbol{x}$ itself. In the second condition, the sub-Gaussian parameter $\sigma_{\varepsilon}$ is attached to the raw random variable $\varepsilon$ rather than to the standardized $\operatorname{Var}(\varepsilon)^{-1/2}\varepsilon$.
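One observation that may be relevant: whitening by $\boldsymbol{\Sigma}^{-1/2}$ makes the condition scale-free, so $\sigma_x$ can plausibly be a universal constant. Here is a minimal numerical sketch of this, assuming (purely for illustration, not from the paper) that $\boldsymbol{x}$ is Gaussian, so the log-MGF is exact: $\log \mathbb{E}[\exp(\boldsymbol{v}^{\top}\boldsymbol{x})] = \boldsymbol{v}^{\top}\boldsymbol{\Sigma}\boldsymbol{v}/2$. The covariance matrix `Sigma` is arbitrary:

```python
import numpy as np

# For centered Gaussian x ~ N(0, Sigma), log E[exp(v^T x)] = v^T Sigma v / 2
# exactly. The smallest sigma^2 with v^T Sigma v <= sigma^2 ||v||^2 for all v
# is the largest eigenvalue of Sigma.

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + np.eye(3)              # an arbitrary covariance matrix

# Raw x: the sub-Gaussian parameter depends on the scale of Sigma.
sigma2_raw = np.linalg.eigvalsh(Sigma)[-1]

# Whitened Sigma^{-1/2} x: its covariance is the identity, so
# v^T I v = ||v||^2 and sigma^2 = 1, no matter what Sigma is.
sigma2_white = 1.0

print(sigma2_raw)    # depends on Sigma
print(sigma2_white)  # always 1 after whitening
```

In other words, under this Gaussian assumption the condition on $\boldsymbol{\Sigma}^{-1/2}\boldsymbol{x}$ holds with $\sigma_x = 1$ for every $\boldsymbol{\Sigma}$, whereas the analogous condition on the raw $\boldsymbol{x}$ would force $\sigma_x^2 \geq \lambda_{\max}(\boldsymbol{\Sigma})$.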
My question: I understand there is a relationship between the variance and the sub-Gaussian parameter. I read from here that
$$ X \in \mathcal{SG}\left(\sigma^2\right) \Longrightarrow \operatorname{Var}[X] \leq \sigma^2. $$ In fact, $\operatorname{Var}[X] \leq \sigma^2(X)$, where $$\sigma^2(X):=\inf \left\{\sigma^2 > 0: \mathbb{E}\left[e^{\lambda(X-\mu)}\right] \leq e^{\lambda^2 \sigma^2 / 2} \text { for all } \lambda \in \mathbb{R}\right\}$$ and $\mu = \mathbb{E}[X]$.
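As a sanity check on that bound, one can approximate $\sigma^2(X) = \sup_{\lambda \neq 0} 2\log \mathbb{E}[e^{\lambda(X-\mu)}]/\lambda^2$ numerically. A sketch for a Rademacher variable (chosen only because its MGF $\cosh\lambda$ is available in closed form, not because it appears in the paper):

```python
import numpy as np

# Rademacher X: P(X = +1) = P(X = -1) = 1/2, so E[X] = 0, Var[X] = 1,
# and E[e^{lam X}] = cosh(lam).
# The optimal sub-Gaussian parameter is
#   sigma^2(X) = sup_{lam != 0} 2 * log(cosh(lam)) / lam^2,
# and since log(cosh(lam)) <= lam^2 / 2 with the ratio tending to 1
# as lam -> 0, the supremum equals 1.
lams = np.linspace(1e-4, 10.0, 100_000)
ratio = 2.0 * np.log(np.cosh(lams)) / lams**2
sigma2_X = ratio.max()

var_X = 1.0
print(sigma2_X)  # ~1.0: here Var[X] <= sigma^2(X) holds with equality
```

So for a Rademacher variable the bound is tight; in general $\operatorname{Var}[X]$ can be strictly smaller than $\sigma^2(X)$.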
When I impose a sub-Gaussian condition on a random variable (or random vector), is there a rule or criterion that tells me whether I should first standardize it by its variance (or covariance matrix)?