Bounding the largest singular value of a random matrix with entries taken i.i.d from $N(0,\sigma^2)$


Given an $m \times n$ matrix $X$ whose entries are i.i.d. from the Gaussian distribution $N(0,\sigma^2)$ (mean $0$, variance $\sigma^2$), I want to bound the spectral norm of $X$, i.e. its largest singular value, in terms of $\sigma, m, n$ with high probability.

I found this MSE post which treats the case $\sigma^2 = 1/n$, but I'm looking for the more general case. They also mention that the bound can be shown via the Bernstein inequality, but I'm not sure how that argument would go, or whether it even applies here. Any explanation of this would also be great.

I don't need a full proof; I'm mainly after the statement of the bound and a direct source that specifically addresses this problem.
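
To be concrete, if the naive scaling argument works, I'd guess the statement should follow by writing $X \stackrel{d}{=} \sigma G$ with $G$ an $m \times n$ matrix of i.i.d. $N(0,1)$ entries, so that $\|X\|_2 = \sigma \|G\|_2$ and the standard-Gaussian bound scales directly:

$$
\|X\|_2 \le \sigma\left(\sqrt{m} + \sqrt{n} + t\right)
\quad \text{with probability at least } 1 - 2e^{-t^2/2},
$$

but I'd like confirmation that this reduction is valid, and ideally a citable reference stating it in this form.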

Thanks!

EDIT: I'm looking for a result similar to Theorem 2.6 in these notes or Theorem 5.32 in this paper, which deal with the same question but for the largest singular value of a random matrix with i.i.d. standard Gaussian $N(0,1)$ entries, bounded (up to a deviation term) by $\sqrt{m} + \sqrt{n}$. I don't know if these references help, but I thought I'd add them to show the kind of result I need.
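
In case it's useful, the scaled bound $\sigma(\sqrt{m}+\sqrt{n})$ is easy to sanity-check numerically. Here's a small NumPy sketch I tried (the helper name `spectral_norm_bound` is my own, and the deviation parameter $t$ is the one from the tail bound in the references above):

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_norm_bound(sigma, m, n, t=0.0):
    """Conjectured bound sigma*(sqrt(m) + sqrt(n) + t), obtained by
    scaling the standard-Gaussian result (helper name is mine)."""
    return sigma * (np.sqrt(m) + np.sqrt(n) + t)

# Empirical check: draw matrices with i.i.d. N(0, sigma^2) entries and
# compare the largest singular value to the scaled bound.
sigma, m, n = 0.5, 200, 300
norms = [np.linalg.norm(rng.normal(0.0, sigma, size=(m, n)), ord=2)
         for _ in range(50)]
bound = spectral_norm_bound(sigma, m, n)
print(max(norms), bound)
```

In my runs the sampled norms sit just below $\sigma(\sqrt{m}+\sqrt{n})$, which is consistent with the scaling argument, though of course a simulation is no substitute for a reference.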