It is known that for a random matrix $A \in \mathbb{R}^{n \times n}$ whose entries are i.i.d. standard Gaussian, the following upper bound holds:
$$ \mathbb{E} [\| A \|] \leq \sqrt{2n \log (2n) } $$
by the Bernstein inequality. Does anyone know how this result changes if the Gaussian entries instead have variance $\sigma$, rather than the standard Gaussian's variance of $1$?
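For what it's worth, a quick Monte Carlo check suggests the answer comes from homogeneity of the operator norm: $\|cA\| = c\|A\|$, so scaling the entry standard deviation by $s$ scales $\mathbb{E}[\|A\|]$ by $s$. The sketch below (with illustrative choices of `n`, `trials`, and entry standard deviations, not taken from the question) compares the empirical mean spectral norm against the stated bound and checks the scaling:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 50  # hypothetical sizes chosen for a fast experiment

def mean_spectral_norm(std):
    # Monte Carlo estimate of E[||A||] for n x n matrices with
    # i.i.d. N(0, std^2) entries; ord=2 gives the spectral norm
    return np.mean([np.linalg.norm(rng.normal(0.0, std, (n, n)), 2)
                    for _ in range(trials)])

base = mean_spectral_norm(1.0)
bound = np.sqrt(2 * n * np.log(2 * n))
print(base, bound)        # empirically, base stays below the bound

# Homogeneity ||cA|| = c ||A||: tripling the entry std
# should triple the mean spectral norm
scaled = mean_spectral_norm(3.0)
print(scaled / base)      # close to 3
```

So if $\sigma$ denotes the entries' standard deviation, the bound would simply pick up a factor of $\sigma$; if it denotes the variance, the factor would be $\sqrt{\sigma}$.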